BIG DATA


The phrase Big Data comes from the computational sciences. Specifically, it describes scenarios in which the volume and variety of data overwhelm the existing tools for storing and processing it. Big Data is a term for collections of data sets so large and complex that they become difficult to process using on-hand database management tools or traditional data processing applications.

VOLUME, VELOCITY AND VARIETY

VOLUME refers to the amount of data being generated. Think in terms of gigabytes, terabytes, and petabytes. Many systems and applications are just not able to store, let alone ingest or process, that much data.
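A common response to volume is to process data in fixed-size chunks rather than loading an entire file into memory. The sketch below uses a small temporary file as a stand-in for a multi-gigabyte input; the chunked-reading pattern is the same at any scale.

```python
# A minimal sketch of volume-aware I/O: read a file in fixed-size chunks
# instead of loading it all at once. The file here is tiny so the example
# is self-contained; it stands in for a far larger input.
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "big.bin")
with open(path, "wb") as f:
    f.write(b"x" * 10_000)  # stand-in for a very large file

chunk_size = 4096
total_bytes = 0
with open(path, "rb") as f:
    while chunk := f.read(chunk_size):
        total_bytes += len(chunk)  # process each chunk; the whole file never sits in memory

print(total_bytes)
```

Because only one chunk is held in memory at a time, the memory footprint stays constant no matter how large the file grows.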

VELOCITY refers to the rate at which new data is generated. Think megabytes or even gigabytes per second: data is streaming in at unprecedented speed and must be dealt with in a timely manner to extract the maximum value.
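One way to cope with velocity is to update a running aggregate as each record arrives, instead of buffering the whole stream. This is a minimal sketch of that idea; the `event_stream` generator is a hypothetical stand-in for a real high-velocity source such as sensor readings or log events.

```python
# A minimal sketch of velocity-friendly processing: each record updates a
# running aggregate on arrival, so no history needs to be stored.
def event_stream():
    """Hypothetical stand-in for a high-velocity source (sensors, logs, ...)."""
    for reading in [21.5, 22.0, 21.8, 23.1, 22.4]:
        yield reading

count = 0
total = 0.0
for value in event_stream():
    count += 1
    total += value
    # The running mean is available at any moment, without buffering the stream.
    running_mean = total / count

print(f"processed {count} readings, running mean = {running_mean:.2f}")
```

The same incremental pattern underlies real stream-processing systems, which keep bounded state while data flows past.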

VARIETY refers to the number of types of data being generated: structured records, semi-structured formats such as JSON and XML, and unstructured content such as text, images, and video.
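Variety means data arrives in different shapes that must be reconciled before analysis. The sketch below, using hypothetical sources, normalises structured (CSV) and semi-structured (JSON) records into one common form so downstream code can treat them uniformly.

```python
# A minimal sketch of handling variety: two differently-shaped sources are
# normalised into a single list of uniform records.
import csv
import io
import json

# Hypothetical structured source (CSV) and semi-structured source (JSON).
csv_source = io.StringIO("user,action\nalice,login\nbob,purchase\n")
json_source = '[{"user": "carol", "action": "logout"}]'

records = []
for row in csv.DictReader(csv_source):
    records.append({"user": row["user"], "action": row["action"]})
for obj in json.loads(json_source):
    records.append({"user": obj["user"], "action": obj["action"]})

print(records)
```

Once normalised, a single analysis pipeline can consume records regardless of which format they originally arrived in.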

Challenges in Big Data

Capture, curation, storage, search, sharing, transfer, analysis and visualization.

Unstructured Data is Exploding.

Systems and enterprises generate huge amounts of data – terabytes and petabytes of data.