Big data and Cloud solutions are becoming central to today's business landscape, impacting everything from planning and forecasting to analytics, and effective execution in between.
Analyzing data has become tedious and time-consuming as data volumes have surged. It grows even more complex when most of the incoming data is unstructured and not easily manageable by structured systems such as relational databases. Consider a scenario where we want to draw inferences from complex interrelationships, such as when a business wants to gauge customer sentiment by mining social media behavioral patterns. In such scenarios, the volume, velocity, and variety challenges can quickly overwhelm traditional methods, which succumb to scalability issues.
Big data solutions suggest the best way to store, manage and analyse data. These solutions often require us to process data in one or more of the following spaces:
- Volume: Data gathered from multiple channels grows so big that it requires specialized parallel-computing platforms such as the Hadoop Distributed File System (HDFS) with its MapReduce engine.
- Velocity: Solutions in the velocity space require accurate data modeling for maintaining user transactions with an in-memory database. Often we solve this with multiple technologies in the web stack backed by multi-tier database systems. Such solutions are termed polyglot persistence.
- Variety: Solutions in this space involve structured, semi-structured, and unstructured data, and encompass real-time, analytical, and search data.
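To make the Volume point concrete, here is a minimal sketch of the MapReduce programming model that Hadoop runs over HDFS: a map phase emits key-value pairs, a shuffle groups values by key, and a reduce phase aggregates each group. This is an illustrative single-process simulation in Python, not Hadoop itself; all function names here are hypothetical.

```python
from collections import defaultdict

def map_phase(document):
    # Mapper: emit a (word, 1) pair for every word in a document chunk.
    for word in document.lower().split():
        yield (word, 1)

def shuffle(mapped_pairs):
    # Shuffle: group all emitted values by key, as Hadoop does
    # between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in mapped_pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reducer: aggregate (here, sum) the counts for each word.
    return {word: sum(counts) for word, counts in groups.items()}

def word_count(documents):
    # In a real cluster, map and reduce tasks run in parallel
    # across many nodes; here they run sequentially in one process.
    mapped = [pair for doc in documents for pair in map_phase(doc)]
    return reduce_phase(shuffle(mapped))
```

Because each mapper works on its own chunk and each reducer on its own key group, the same logic scales out across a cluster, which is what makes the model attractive for the volume problem.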
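The Velocity point, pairing an in-memory store with a durable relational tier, is often implemented with a cache-aside pattern. The sketch below uses plain Python dictionaries to stand in for the two tiers (e.g., Redis in front of an RDBMS); the class and its interface are assumptions for illustration, not a specific product API.

```python
class PolyglotStore:
    """Cache-aside sketch: hot reads hit the in-memory tier,
    misses fall back to the slower durable tier."""

    def __init__(self, durable_store):
        self.cache = {}               # stands in for an in-memory DB (e.g., Redis)
        self.durable = durable_store  # stands in for a relational database

    def get(self, key):
        # Fast path: serve from memory when possible.
        if key in self.cache:
            return self.cache[key]
        # Slow path: fetch from the durable tier, then warm the cache.
        value = self.durable.get(key)
        if value is not None:
            self.cache[key] = value
        return value

    def put(self, key, value):
        # Write through to the durable tier, keeping the cache consistent.
        self.durable[key] = value
        self.cache[key] = value
```

The design choice here is that the durable tier remains the source of truth, so losing the in-memory tier costs only latency, not data.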
Today, the Cloud plays a pivotal role in enterprise business dynamics. Cloud-based BI solutions can help enterprises get more cost-effective business intelligence, as they save enterprises from investing heavily in in-house IT expertise and infrastructure. In a Big data environment, the specialized parallel-computing platforms described above require large computing infrastructures, and that is where cloud computing becomes very attractive because of its pay-as-you-go model. By achieving scalability and bringing flexibility to the required infrastructure, the Cloud seems a natural fit for implementing Big Data systems.
Agile methods have boosted and transformed the development of big, complex systems by significantly reducing, if not eliminating, overheads such as timeline delays and budget overruns. Agile methods speed up the development process while taking shifts in user requirements into account. In every cycle of an agile program, a prototype is built that can function on its own as well as serve as a building block for larger, more complex systems.
Cloud helps Agile Model
The secret of the agile model is customer collaboration, and the Cloud has an obvious advantage in facilitating this collaboration: it enables deployment of early builds, giving customers real-life experience of the application. The application can also be deployed in a region close to the customer's location, reducing network latency. These are clear advantages of the cloud over traditional data centers.
Deploying early builds normally does not require heavy infrastructure; your web servers, app servers, and database machines can be thin instances. Agile cycles with cloud deployment thus align the development team with the business and are also cost-effective.
These intertwined technologies, Agile, Cloud, and Big data, are emerging as a powerful combination. This 'ABC' design relationship (Agile-Big data-Cloud) can be leveraged by enterprises to make effective decisions and gain competitive advantage.
Delving into this theme, Shrikant Pattathil, Executive Vice President, Harbinger Systems, delivered an insightful session at Cloud Expo West 2013. It covered real-world case studies on how big data, cloud, and agile can be leveraged to build great applications. Here is the slide deck of the speaker presentation. We hope you find this presentation helpful.