

What is Big Data?

Overview:

“90% of the world’s data was generated in the last few years.”
Due to the advent of new technologies, devices, and communication means like social networking sites, the amount of data produced by mankind is growing rapidly every year. The amount of data produced by us from the beginning of time till 2003 was 5 billion gigabytes. If you piled up that data in the form of disks, it could fill an entire football field. The same amount was created every two days in 2011, and every ten minutes in 2013. This rate is still growing enormously. Though all this information is meaningful and can be useful when processed, most of it is being neglected.

What is Big Data?

Big Data is a collection of large datasets that cannot be processed using traditional computing techniques. It is not a single technique or a tool, rather it involves many areas of business and technology.

Thus Big Data includes huge volume, high velocity and an extensible variety of data. The data in it can be of three types (a small sketch follows the list):
- Structured data: relational data.
- Semi-structured data: XML data.
- Unstructured data: Word documents, PDFs, plain text, media logs.
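To make the distinction concrete, the sketch below shows the same invented customer record in each of the three forms, using Python purely for illustration; the names and values are made up for the example.

    # Illustrative only: the same invented customer record in three forms.
    import xml.etree.ElementTree as ET

    # Structured data: fits a fixed relational schema (typed columns in a table row).
    structured_row = ("C001", "Alice", 34, "alice@example.com")

    # Semi-structured data: self-describing tags, but no rigid schema (e.g. XML).
    semi_structured = """
    <customer id="C001">
        <name>Alice</name>
        <age>34</age>
    </customer>
    """

    # Unstructured data: free text, documents, media logs -- no predefined model.
    unstructured = "Alice called support on Monday and said she liked the new layout."

    # The semi-structured record can still be navigated by tag name.
    root = ET.fromstring(semi_structured.strip())
    print(root.find("name").text)   # -> Alice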

Benefits of Big Data:
- Using the information kept in social networks like Facebook, marketing agencies are learning about the response to their campaigns, promotions, and other advertising media.
- Using information in social media such as consumer preferences and product perception, product companies and retail organisations are planning their production.
- Using data regarding the previous medical history of patients, hospitals are providing better and quicker service.

Big Data Technologies:
Big data technologies are important in providing more accurate analysis, which may lead to more concrete decision-making resulting in greater operational efficiencies, cost reductions, and reduced risks for the business.

To harness the power of big data, you would require an infrastructure that can manage and process huge volumes of structured and unstructured data in real time while protecting data privacy and security.

There are various technologies in the market from different vendors including Amazon, IBM, Microsoft, etc., to handle big data. While looking into the technologies that handle big data, we examine the following two classes of technology:

Operational Big Data:

These include systems like MongoDB that provide operational capabilities for real-time, interactive workloads where data is primarily captured and stored.
NoSQL Big Data systems are designed to take advantage of new cloud computing architectures that have emerged over the past decade to allow massive computations to be run inexpensively and efficiently. This makes operational big data workloads much easier to manage, cheaper, and faster to implement.
Some NoSQL systems can provide insights into patterns and trends based on real-time data with minimal coding and without the need for data scientists and additional infrastructure.
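As a rough illustration of such an operational workload, the Python sketch below writes and reads documents with MongoDB through the pymongo driver. The connection string, database, collection and field names are all assumptions chosen for the example, not part of any particular deployment, and a local MongoDB instance is assumed to be running.

    # A minimal sketch of an operational, real-time workload on MongoDB via the
    # pymongo driver. The connection string, database, collection and field
    # names below are assumptions made up for this example.
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")   # assumed local instance
    events = client["shop"]["events"]                   # hypothetical database/collection

    # Write path: capture an interactive event as it happens.
    events.insert_one({"user": "u123", "action": "add_to_cart", "item": "sku-42"})

    # Read path: serve a real-time, interactive query for that user's latest events.
    for doc in events.find({"user": "u123"}).sort("_id", -1).limit(10):
        print(doc["action"], doc.get("item"))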

Analytical Big Data:

These include systems like Massively Parallel Processing (MPP) database systems and MapReduce that provide analytical capabilities for retrospective and complex analysis that may touch most or all of the data.
MapReduce provides a method of analysing data that is complementary to the capabilities provided by SQL, and systems based on MapReduce can be scaled up from a single server to thousands of high- and low-end machines.
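The MapReduce model is easiest to picture with the classic word-count example. The sketch below emulates the map, shuffle and reduce phases in a single Python process; an actual framework such as Hadoop MapReduce would distribute these phases across many machines, but the shape of the computation is the same.

    # A toy, single-process illustration of the MapReduce model: word count.
    # A real framework (e.g. Hadoop MapReduce) distributes these phases across
    # a cluster; this sketch only shows the shape of the computation.
    from collections import defaultdict

    documents = ["big data is big", "data needs processing"]

    # Map phase: emit a (word, 1) pair for every word in every document.
    mapped = [(word, 1) for doc in documents for word in doc.split()]

    # Shuffle phase: group the intermediate pairs by key (the word).
    groups = defaultdict(list)
    for word, count in mapped:
        groups[word].append(count)

    # Reduce phase: sum the counts for each word.
    word_counts = {word: sum(counts) for word, counts in groups.items()}
    print(word_counts)   # {'big': 2, 'data': 2, 'is': 1, 'needs': 1, 'processing': 1}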

These two classes of technology are complementary and frequently deployed together.

Big Data Challenges:
The major challenges associated with big data are as follows:
- Capturing data
- Curation
- Storage
- Searching
- Sharing
- Transfer
- Analysis
- Presentation
To address the above challenges, organisations normally take the help of enterprise servers.




