Disruptive Possibilities: How Big Data Changes Everything by Jeffrey Needham

In Disruptive Possibilities: How Big Data Changes Everything, Jeffrey Needham enlightens Fortune 500 companies about the big data environment as they begin to channel their data from stranded silos into an accessible reservoir of possibility and discovery. This book explains where commercial supercomputing came from, as well as its impact on the future of computing.

Similar data modeling & design books

Modular Ontologies: Concepts, Theories and Techniques for Knowledge Modularization

This book constitutes a collection of research achievements mature enough to provide a firm and reliable foundation on modular ontologies. It gives the reader a detailed analysis of the state of the art of the research area and discusses the recent concepts, theories and techniques for knowledge modularization.

Advances in Object-Oriented Data Modeling

Until recently, information systems have been designed around different business functions, such as accounts payable and inventory control. Object-oriented modeling, in contrast, structures systems around the data--the objects--that make up the various business functions. Because information about a particular function is limited to one place--to the object--the system is insulated from the effects of change.

Introduction To Database Management System

Designed specifically for a single-semester first course on database systems, there are four features that differentiate our book from the rest. Simplicity - in general, the technology of database systems can be very hard to understand. There are

Extra info for Disruptive Possibilities: How Big Data Changes Everything

Sample text

Some businesses can afford to be risk averse, but most cannot. To mitigate risk, corporations employ many strategies that require some degree of calculated risk. Sometimes it is calculated very accurately with numbers and sometimes employees just have to make educated guesses. In preparation for big data, corporations need to optimize some of the computing systems that are now the heart and lungs of their business. Accurately analyzing the risk of component failure within any platform (hardware, software, humans, or competitors) is a key to optimizing the efficiency of those platforms.

If the infrastructure group is given the authority to behave unilaterally, it compromises the diplomatic mission. There is always a fine line between diplomacy, moral suasion and unilateralism. Done well, this group serves both the platform and business. Done poorly, this group ends up being just another silo. Other companies construct “tiger teams” by forcing subject matter experts from a number of different silos to work together temporarily. In contrast, when teams of specialists in the ’50s and ’60s needed to develop a working knowledge of those old mainframe systems, they were given the latitude and time to cross-pollinate their skills as specialists in one area and generalists in others.

This approach to product development also tends to stifle innovation. If a franchise remains strong and can be enforced so captives can’t escape, vendors can still make a decent living. But no such franchise exists today in big data, even among established players like IBM, Oracle and EMC. Enterprise customers continue to purchase new warehouse products that promise to solve all their data problems, only to have to move—yet again—all the data from the last failed platform to the new and improved one.
