
Post Tagged with: Big Data

The three V’s for successful Big Data adoption programs

Research and development is part of every successful organization’s mantra, and Big Data plays a very important role in fueling that research. How Big Data analytics is used can determine the success or failure of an organization. At a time when there is so much information to process, it is imperative to experiment with a dynamic variety of Big Data strategies so that you arrive at the best mix for success. A good mix of Big Data strategies is governed by three major principles, the three V’s – Volume, Velocity and Variety – which together act as a force to propel your organization into the realms of success.

# Volume

A successful Big Data strategy should be fully cognizant of high volumes of data and allow for experimentation with products and services. To go one step further, a Big Data strategy should always align with the big-picture strategy of the organization itself, so that once experimentation has run its course, the best plan is adopted. According to experts, the best approach is to run multiple strategies at once, provide the team with whatever resources they need, and hope that something truly innovative emerges.

# Velocity

Time is key to building successful products and services, and the velocity at which an idea is generated using Big Data is a determining factor. All ideas and suggestions should be channeled into an innovation funnel that ultimately filters out the winning product. This is where fast execution of plans comes in. After forming a clear vision of what needs to be achieved, the next step is to work on the Big Data alongside the experimentation process. This requires good tooling in the operational systems so that time lag is minimal, which in turn shortens the innovation cycle and lets big ideas surface.

# Variety

It is well known that variety is the spice of success, and Big Data analytics is no different. If you have a variety of innovative ideas that could become something big, chances are you are in fact onto something big. The best strategy is to take an array of products/services and potential customers and combine them with the collection of ideas in the innovation funnel. Keep in mind that the best way is to form a clear picture of your USP and then make everything revolve around it.

All these V’s help you take advantage of Big Data in an optimal way to gain a competitive advantage, and there is no denying that Big Data and rapid experimentation go hand in hand in successful innovation programs in organizations.

Obama demands review of Big Data industry in light of privacy

President Obama, in his recent speech at the Justice Department, urged the National Security Agency (NSA) to improve the security of its networks to protect important user information.

In his address, Obama expressed concern over security, especially after a year full of privacy upheavals following the prominent Edward Snowden leaks, which exposed the NSA’s tapping of foreign fiber optic cables and cracking of encryption protocols used by famous tech companies. Obama opined that such leaks have made the task of national security more difficult.

Big companies like Google and Facebook, in light of recent incidents of information theft, have already strengthened their security systems. Understanding the situation, Obama has advised the NSA to review the current role of its security apparatus and to assess how widespread future Internet surveillance programs need to be.

The US government has understood the complexities of Big Data analytics and realized the benefits enterprises have reaped from data interpretation at such a large magnitude. The President has ordered a comprehensive review of Big Data analytics and privacy; for this, a group of government officials will be constituted to work in tandem with the President’s Council of Advisors on Science and Technology and reach out to business leaders and privacy experts to understand how the public and private sectors are facing the challenges posed by Big Data.

The presidential working group will examine how private and public companies collect Big Data, and how the collection of this avalanche of data and its analysis for purposes beyond intelligence and law enforcement affects privacy.

In his speech, Obama promised reforms to safeguard the privacy of Americans by providing transparency and protecting personal information both offline and online. Thus, the prima facie motive of reviewing Big Data analytics and privacy is to understand how the NSA can promote the free flow of information consistent with both security and privacy, and to identify areas where policy reform might be required to restrain Big Data technologies.

Identifying important information in Big Data to answer real world challenges

Over the past year, knowingly or unknowingly, Big Data has become the biggest buzzword, one enterprises are finding hard to ignore. Believe it or not, judging by the current dependence on Big Data and its allied technologies, we can assume that Big Data is here to stay, and we will all have to use it to address our real-world problems.

The way Big Data technologies have evolved in real-world enterprises shows that even technologists and scientists who might previously have disparaged the term will now be acknowledging it.

Like everything else, Big Data has its loopholes. Big Data problems are caused not by the unavailability of data but by its abundance. There is such an influx of data that it is all but impossible to know which piece of information is actually important and how the important pieces can be combined into something meaningful.

Researching a general way to understand complex systems and to answer the biggest question that Big Data can’t answer – ‘how do we know what’s important in a complex world?’ – Yaneer Bar-Yam, president of the New England Complex Systems Institute, has devised a way to identify patterns in the largest-scale behavior of a system. Bar-Yam revealed his findings in an article titled “Beyond big data: Identifying important information for real world challenges”.

According to Yaneer, to understand and address most social and biological challenges, it is important to frame a scientific inquiry that objectively determines what is important and what is not, instead of amassing larger and larger sets of data.

Yaneer explains that patterns of behavior identified from a handful of information are the key to understanding a system and to informing how its behavior can be influenced in the future.

Yaneer Bar-Yam and his team have successfully tested this approach by making predictions about various complex systems and real-world challenges, such as market crashes, ethnic violence and food prices, among many other biological and social systems.

Via: NECSI

Four high value use cases for Big Data, are you doing it right?

With ever-increasing digitization, organizations are accumulating terabytes of data annually. Presently, most of this unstructured data goes unused, though it is retained for regulatory purposes. However, the current trend in data analytics suggests that know-how of Big Data can work in enterprises’ favor in the long run.

Big Data will play a significant role in enterprises, but one question that surrounds the authority of Big Data analytics is how it can actually be used to add value.

Few managers have mastered the art of making decisions based on data analytics – something they would have done on gut feeling until just a couple of years ago. But since the influx of data is such that traditional data management systems cannot cope with it, managers have become dependent on Big Data analysts to turn this avalanche of data into meaningful decisions.

Having understood the applicable uses of Big Data to an extent, it becomes imperative to know where Big Data will work within the enterprise and what problems it can address.

Here we have listed four instances which, according to experts, make Big Data analytics worth the investment for organizations. These are the high-value use cases of Big Data as identified by IBM.

1.    Exploration of Big Data

The idea of Big Data exploration is to have companies research their existing transactions and repositories using Big Data techniques. This allows companies to accumulate data from various sources stored in different places in order to create a clear picture of the available data and gain insight into how to use it for valuable results. Data exploration thus means finding, understanding and visualizing Big Data to improve the quality of decision making. Big Data exploration essentially addresses the business problem of data being stored in different systems by accumulating it in one place for all to see and analyze.
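
To make the idea concrete, here is a minimal sketch of such consolidation in Python with pandas. The tables and the customer_id key are invented stand-ins for data scattered across two systems; a real exploration effort would pull from many more sources.

```python
import pandas as pd

# Stand-ins for exports from two separate line-of-business systems;
# in practice these would come from pd.read_csv or database queries.
crm = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "region": ["north", "south", "east"],
})
orders = pd.DataFrame({
    "customer_id": [1, 1, 3],
    "amount": [120.0, 80.0, 42.5],
})

# Accumulate the scattered data in one place for everyone to analyze.
combined = crm.merge(orders, on="customer_id", how="left")

# First-pass exploration: size, gaps, and basic statistics.
print(combined.shape)
print(combined.isna().sum())
print(combined.describe(include="all"))
```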

2.    Enhancing customer knowledge

Companies use Big Data to build a 360-degree view of their customers, letting them understand and engage more personally with each one – for example, telecom companies use phone call records and social media activity to understand a customer’s behavior. This enhanced customer view enables enterprises to gain a full understanding of the customer and then position goods and services based on that analysis.
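
As a toy illustration of such a 360-degree view – the call_minutes and sentiment figures below are invented, and a real telecom pipeline would be far richer – two per-customer signals can be aggregated and joined into one profile:

```python
import pandas as pd

# Hypothetical per-customer signals from two channels.
calls = pd.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "call_minutes": [12, 30, 5, 44],
})
posts = pd.DataFrame({
    "customer_id": [1, 2, 2],
    "sentiment": [0.8, -0.4, 0.1],  # e.g. scores from a sentiment model
})

# Aggregate each channel, then join into one 360-degree profile.
profile = (
    calls.groupby("customer_id")["call_minutes"].sum().to_frame()
    .join(posts.groupby("customer_id")["sentiment"].mean(), how="outer")
)
print(profile)
```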

3.    Extension of security

Big Data analytics can be used to detect fraud by analyzing credit card transactions, or to detect terrorism and cyber crime by constantly monitoring data traffic, phone calls, social media, emails and so on. With Big Data analytics, fraud and cyber security threats can be monitored in real time.
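
One simple way to sketch the fraud-detection idea – an illustrative baseline, not any bank’s actual system – is to flag a transaction that sits several standard deviations above a customer’s historical spending:

```python
import numpy as np

def flag_suspicious(history, new_amount, threshold=3.0):
    """Flag a transaction whose amount is more than `threshold`
    standard deviations above the customer's historical mean."""
    mean, std = np.mean(history), np.std(history)
    if std == 0:
        return new_amount > mean  # degenerate history: any increase is odd
    z = (new_amount - mean) / std
    return z > threshold

# Example: a customer who normally spends about 40-60 per transaction.
history = [42.0, 55.0, 48.0, 60.0, 51.0]
print(flag_suspicious(history, 52.0))   # False: in line with history
print(flag_suspicious(history, 900.0))  # True: likely worth review
```

Real systems use far richer features (merchant, location, time of day) and models, but the real-time monitoring loop has the same shape: score each incoming transaction against history as it arrives.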

4.    Using Big Data for operations analysis

Connected gadgetry and the Internet of Things are creating new data at great speed, with smart devices contributing immensely to the data stream. Analyzing this avalanche of data allows companies to improve performance: the abundance of data coming from sensors, GPS devices, IT machines and the like can be analyzed with Big Data operations analysis to give companies real-time insight into what is going on.
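
A minimal sketch of operations analysis on such a stream, using invented (machine, timestamp, temperature) readings: roll the feed up into one-minute windows and flag machines running hot.

```python
from collections import defaultdict

# Hypothetical stream of (machine_id, epoch_seconds, temperature_c) readings.
readings = [
    ("m1", 0, 61.0), ("m1", 20, 63.5), ("m1", 70, 91.0),
    ("m2", 10, 55.0), ("m2", 65, 56.2),
]

# Roll the stream up into one-minute windows per machine.
windows = defaultdict(list)
for machine, ts, temp in readings:
    windows[(machine, ts // 60)].append(temp)

# Real-time-style insight: average per window, flag anything running hot.
for (machine, minute), temps in sorted(windows.items()):
    avg = sum(temps) / len(temps)
    status = "ALERT" if avg > 80.0 else "ok"
    print(f"{machine} minute={minute} avg={avg:.1f}C {status}")
```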


3 Big Problems Big Data Will Probably Create in the Near Future

Big Data has undoubtedly been the biggest buzzword of the past year. One can look back at the just-concluded 2013 and consider it the breakthrough year for the term Big Data.

Big Data may not be an outright innovation as a term, but it certainly is in terms of awareness. In spite of Big Data receiving more attention in the mainstream, there are businesses and individuals who still confuse the term and use it inappropriately.

All things said, business enterprises are investing big in Big Data with the aim of getting the best out of advanced data analytics. As mobile, internet and cloud data trends multiply, the need for more robust Big Data platforms such as Hadoop has been felt. Though the real potential of Big Data is still too abstract to nail down, the ramifications and business challenges it will create have already begun to show.

Read on for the three most important problems Big Data analytics will probably create in the near future.

1.    Legal and privacy risk issues

Big Data can be used for good, and it can obviously be harnessed for the betterment of society. But it can also be abused – so not everything is sunny about Big Data. Since accumulating more data means a greater threat to privacy, the privacy challenges around Big Data are nothing new. It may be the dark side of Big Data, but the average consumer has begun to understand its implications.

This becomes a challenge because enterprises use Big Data to benefit from advanced analytics: a Sand Hill survey found that almost 62 percent of enterprises use Hadoop for the advanced analytics it can provide.

In 2014, the rise of the Internet of Things – bringing more mobile data, drone data, sensor data and even image data – is bound to create more legal concerns over Big Data privacy. This, as explained, is because consumers are becoming more aware of the real impact of Big Data on their lives. It is therefore important for enterprises to stay ahead on compliance and keep up to date with changing data protection laws.

2.    Human decision making vs. data-driven decision making

As more businesses pursue Big Data to drive their decision making, a clash between ways of doing things is coming. As MIT Sloan School of Management research scientist Andrew McAfee points out, most management education programs train employees to trust their gut. Trusting the gut is the old way of making decisions, so replacing it with data-driven decision making can lead to conflict. Becoming data-driven will require businesses to undergo a paradigm shift, since being data-driven or not will become the competitive differentiator between successful and not-so-successful businesses.

3.    Big Data used for discrimination

Many research projects based on the use of Big Data have raised concerns about data being used for discrimination, on top of the looming privacy concerns.

Researchers including Kate Crawford of Microsoft suggest that Big Data is rapidly being put to use for precise forms of discrimination. Discrimination itself is nothing new, but Big Data creates a new, automated form of it. Researchers suggest that social media and health care are the most vulnerable areas.

To safeguard against the issue of discrimination, organizations can create transparent Big Data usage policies in order to protect consumer data.


Unabated Experimentation is the Way Forward in Big Data

While it is true that analytical modeling calls for nonstop testing of big data, the equation isn’t that straightforward and holds certain potential challenges.

The need of the hour is active experimentation in the big-data zone to help in-progress analytical models make precise correlations. But since statistical models carry their own risks, their astute application is a must, especially if we want the results to be positive.

While a few groups are still hesitant, most full-size organizations have honed their insight enough to realize that big data calls for incessant experimentation, and they support the change. At the same time, they know that the practical reality of this booming field involves certain risks associated with statistical models, especially when their implementation is not flawless.

Statistical Modeling – Practicality and Risks

Statistical models are simplified tools that data science employs to recognize and validate the major correlative factors at work in a particular field. They can, however, give data scientists a false sense of validation at times.

Despite fitting the observational data quite well, many such models have been found to miss the real causative factors at work. This is why predictive validity is often missing from the seemingly deep insight such a model offers.
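
A tiny synthetic illustration of this failure mode (not from the article): below, a hidden factor drives both x and y, so a fitted line looks excellent on observational data, yet its predictions collapse once x is varied independently of that hidden factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hidden causative factor drives both x and y; x itself causes nothing.
confounder = rng.normal(size=200)
x = confounder + rng.normal(scale=0.1, size=200)
y = confounder + rng.normal(scale=0.1, size=200)

# The observational fit looks excellent...
slope, intercept = np.polyfit(x, y, 1)
print("in-sample correlation:", np.corrcoef(x, y)[0, 1])

# ...but if x is now set independently of the confounder (an intervention),
# the model's predictions fall apart: y no longer follows x at all.
x_new = rng.normal(size=200)             # x varied on its own
y_new = rng.normal(scale=0.1, size=200)  # y now driven only by noise
pred = slope * x_new + intercept
print("out-of-sample squared error:", np.mean((pred - y_new) ** 2))
```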

What May Go Wrong?

Even though applying a statistical model is practical in business, there is always a need to scrutinize the true, fundamental causative factors.

The biggest risk may be a lack of confidence, particularly when you doubt whether the past correlations that make up your statistical model will remain relevant in the near future. And obviously, a predictive model of product demand and customer response in which you have low confidence will never pull in large investments during a product launch!

What is the Scope?

Even though certain risks are involved, statistical modeling is far from dead. To detect causative factors more quickly and effectively, statistical modeling will need to be grounded in real-world experimentation. This approach, employing a continuous series of real-world experiments, will go a long way toward making the big data business model and economy more authentic and reliable.
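
As a hedged sketch of what one such real-world experiment might look like in code – a plain two-variant test on made-up conversion counts, not a method prescribed by the article – a two-proportion z-test is about the simplest starting point:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Synthetic experiment: variant B converts 260/2000 vs A's 200/2000.
z = two_proportion_z(200, 2000, 260, 2000)
print(f"z = {z:.2f}")  # |z| > 1.96 is roughly significant at the 5% level
```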

So How’s Real-world Experimentation Going to Be Possible? 

Just as data scientists have developed advanced operational functions for ceaseless experimentation, big organizations now look to encourage their expert business executives to lead the charge in running nonstop experiments for better output. Conveniently, the big data revolution has already delivered in-database platforms for executing models, along with economical yet high-output computing power, making real-world experimentation feasible everywhere, in scientific and business domains alike.

The basic idea is to prefer spending time, capital and other resources on more low-risk experiments over putting extra effort into building the same models over and over again!

Oracle Launches 5th Gen Database Machine

Oracle Exadata Database Machine X4, the 5th gen database machine from Oracle, is a revolutionary step in the field of database management. Keep reading to find out what it has to offer!

Oracle recently launched the Exadata Database Machine X4, with hi-tech hardware and software that can increase capacity, boost performance and maintain quality and efficiency of service for database operations.

The update focuses mainly on optimizing Online Transaction Processing (OLTP). The machine is aimed at providing businesses with a lasting solution to all major database challenges, with strengths in Data Warehousing and Database as a Service (DBaaS).

Oracle Exadata Database Machine X4 – Features

  • The machine is the fifth generation of Oracle Exadata, a product line first launched in 2008.
  • This generation features improvements focused on performance and quality of service for OLTP, Database as a Service and Data Warehousing.
  • It combines high-speed flash compression with larger physical flash to increase effective flash memory capacity, which in turn accelerates OLTP workloads.
  • The latest Flash Caching algorithms help accelerate the performance of all workloads in Data Warehousing.
  • Many databases can be consolidated under the Database as a Service design thanks to the machine’s extreme capacity and performance, helping businesses improve agility and, more importantly, reduce costs.

Hadoop Security: Present and Future

While the current level of Hadoop security can be relied upon for data protection and processing, there is still a need to improve it to ensure foolproof big data security in the times to come. To stay up to date on what a secure Hadoop cluster offers today and will offer in the future, one needs to know a few important things about it.

Security is the foremost agenda item underlying almost all major requirements within an organization, especially for tasks like big-data processing. Hadoop has registered remarkable progress in the last couple of years and has successfully addressed the most common worries: authorization, authentication and, above all, data protection. With more security-enhanced Hadoop clusters in the pipeline, those using the systems are banking on the safety of all vital data in the future as well.

Hadoop is currently engaged at the cutting edge, providing secure support to countless financial services applications and big private healthcare projects that operate in highly security-sensitive environments. Recent upgrades of Hadoop systems meet the key requirements of organizations bound by some of the world’s toughest security norms. With all the tight security controls incorporated in Hadoop, the final objective remains flexibility and smooth data processing, now and in the future.

Security Controls for Hadoop at Present

Securing a Hadoop cluster presents challenges both small and big, including its distributed nature, which is to a large extent responsible for its success. A layered approach is best for securing a system, and distribution happens to be one of the most complex barriers to it.

Following are the major layers that are in place to secure a cluster:

Authentication

It is responsible for verifying the identity of both a system and a user accessing it. Hadoop provides two authentication modes: pseudo (simple) authentication and Kerberos. The first merely trusts the identity a user claims, while Kerberos secures the overall Hadoop cluster with strong verification.
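
For a concrete sense of the switch, the stock Hadoop settings live in core-site.xml: hadoop.security.authentication chooses between "simple" (pseudo) and "kerberos", and hadoop.security.authorization turns on service-level checks. The Python sketch below merely generates that configuration stanza; it is not a full Kerberos setup, which still needs a KDC, principals and keytabs.

```python
# A minimal sketch (not an official tool): emit the two core-site.xml
# properties that switch a Hadoop cluster from simple auth to Kerberos.
import xml.etree.ElementTree as ET

props = {
    "hadoop.security.authentication": "kerberos",  # default is "simple"
    "hadoop.security.authorization": "true",       # service-level ACL checks
}

conf = ET.Element("configuration")
for name, value in props.items():
    prop = ET.SubElement(conf, "property")
    ET.SubElement(prop, "name").text = name
    ET.SubElement(prop, "value").text = value

ET.ElementTree(conf).write("core-site.xml", xml_declaration=True)
```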

Authorization

Authorization represents the access freedom of users and systems. For it, Hadoop relies on resource-level access controls, file permissions in HDFS, and service-level access control.

Accounting

Accounting makes it possible to track resource use in a system. MapReduce and HDFS, both parts of Apache Hadoop, offer base audit support, while Apache Oozie, a workflow engine, offers an audit trail for all services.

Data Protection 

This takes care of the privacy of information. HDP protects data in motion, and HDFS supports encryption at the operating-system level.

Security Controls for Hadoop in Future

Newer innovations in Hadoop security focus mainly on making the various security frameworks work together so that they can be managed easily. Here’s where the Hadoop security system is going to be big:

Granular Authorization and Enhanced Authentication

The verification technique in most Hadoop modules is in the process of being improved. It is being developed and fortified mainly because users are demanding a security-hardened authorization model. Token-based validation will soon replace Kerberos to enhance the authentication process.

Encryption Data Protection and Improved Accounting

A more advanced encryption algorithm is a must for most channels, and the focus will be on better encryption, mostly across HBase, HDFS and Hive. Another important step is going to be high-tech audit record correlation for easier reporting: with such a system, an auditor would be able to trace the sequence of Hadoop component operations without having to rely on external tools.

Be Smart With Big Data

Some companies are scared of big data. They think that since data is inherently dumb, a lot of it must be dumber still. But by being smart about big data, analysts can make sure they get the most out of it. Handling big data can be a security risk, so it needs to be handled smartly.

The Present Way of Doing Things

Companies usually handle data in one of three ways. The first is the Heroic Model, in which individuals take charge of requests and make decisions on their own without consulting others. This model can work well for small businesses, where individuals are usually aware of most situations across all areas of the business, but in bigger businesses it can lead to confusion and chaos.

The Culture of Discipline, on the other hand, is one where individuals don’t make any decisions and instead follow a set of rules laid down by management. Employees in this model can’t use data for their own decision making and simply follow the processes set up for them.

The best way to handle data is the Data Smart Model, in which data is managed through an evidence-based management system. It combines the first two methods: it runs on disciplined processes, but decision making is allowed at the individual level. This is the method that should be used to handle big data, and it can result in smooth operation without much hassle.

How to Cultivate the Data Smart Culture

Certain steps need to be taken to create the data smart culture.

  • There should be a single source of truth. Decision making can be moved to the employee level, but the guiding principles should come from a single source.
  • Use ways to keep track of progress. A scorecard system, even one updated daily, can help managers across different branches see how they are performing relative to other departments, and they can then feed better data back to record their progress.
  • Rules are important, but there should be enough flexibility. Rules and guiding principles are needed, but so is the flexibility to know when to bend the rules and when to break them. Sometimes what works in most parts of the country might not be best for a certain area, and businesses need to be able to adapt to such situations and change their rules accordingly.
  • Work on cultivating human resources. People are a company’s biggest asset, and it is important to educate them and give them the proper know-how to handle data. Managers need to be trained to educate the people working under them and to engage with them one on one.

These steps can help businesses handle big data smartly and without much confusion. Every level needs to be trained to handle big data as the future is going to be all about big data.

9 Recent Surveys About Big Data

Big Data is the big word right now, and many surveys have been conducted to find out just how big Big Data is. Take a look at the highlights of 9 such surveys to find out where big data is headed.

1. CompTIA

CompTIA, the IT industry association, surveyed 500 businesses and found that:

  • 42% of businesses admitted that they have some type of big data initiative going on.
  • 93% said that data was critical for their business.
  • 18% thought that their business was ready for big data.

2. EMA and 9Sight Consulting

259 businesses and professionals were surveyed in this end user research. They found that:

  • 68% of the companies have at least 2 projects in their big data initiative.
  • 34% companies are using big data implementations in production.
  • 39% companies identified speeding operational time for analytics to be the number one driver for big data initiatives. Other drivers included competitive advantage with data use in business solutions (34%) and business requirements for higher levels of advanced analytics (31%).

3. Tech Pro Research

Tech Pro Research surveyed 144 businesses about the financial side of big data.

  • 54% have no interest in implementing big data initiatives. 8% have implemented some form of big data initiative, 12% are implementing and 26% are planning on implementation.
  • 82% of those who have implemented big data initiatives report seeing some form of payoff while only 4% believe they haven’t seen any benefits.

4. Gartner

They surveyed 720 members of the Gartner Research Circle.

  • 64% of companies are investing or planning to invest in big data.
  • But less than 8% have already deployed some form of initiative.

5. TEKsystems

They surveyed more than 2000 IT professionals and 1500 IT leaders. They found in their report:

  • 90% of IT leaders and 84% of IT professionals believe in big data as a good investment for time and money.
  • 14% of IT leaders said that big data is regularly applied in their businesses.
  • 66% of IT leaders and 53% of IT professionals said that their data is stored in disparate systems.
  • 60% of IT leaders said that there is no accountability for data quality.
  • At least 50% of IT leaders were not sure about the validity of their data.
  • 81% of IT leaders accepted that they do not have the adequate manpower with the right skill sets to implement big data initiatives.

6. Bain

Bain studied 400 large companies and found encouraging results for big data.

  • They found that companies with big data analytics capabilities were outperforming their competition.
  • They were twice as likely to have better quarterly financial performance.
  • Five times more likely to make faster decisions.
  • Three times as likely to execute decisions.
  • And twice as likely to use data in their decision making process.

7. BCG

BCG surveyed 10,000 consumers in 20 countries worldwide. They found that:

  • 75% of consumers are concerned about privacy of their data.
  • The young generation is just as concerned about privacy as older generations.
  • Consumers will allow use of data as long as they trust the business with their data.

8. IBM

IBM studied 900 businesses from around the world. They found that the companies that were outperforming their peers were:

  • 166% more likely to make decisions based on data.
  • 2.2 times more likely to have a clear path for big data analytics in their organization.
  • driven by growth as the main source of value from data analytics.
  • measuring the impact of investment in analytics.

9. Forbes Market Insights

Rocket Fuel sponsored this study by Forbes Market Insights of 211 senior marketers.

  • The marketers that used big data at least half of the time in their campaigns said they exceeded their goals 3 out of 5 times.
  • Those who used data less than half of the time achieved similar results only 1 out of 3 times.
  • 92% of companies that used big data exceeded their goals, and only 5% fell short.

From such surveys it is clear that it is still early days for big data. Those who have taken the initiative have found dividends in big data, while some remain skeptical. Information taken from the article on Forbes.