Articles by: Steve

Big Data Startups Luring Huge Investments

Big data startups attracted enormous investments in 2013, and experts are betting this will redefine investors' attitude toward them. Read on for the most recent stats from 2013.

Big Data Investments May Pay Rich Dividends in Near Future

In 2013 alone, startups focused on big data pulled in investments worth over $3.6 billion. The figure has surpassed all expectations, and experts find it truly intriguing: it amounts to nearly three quarters of the total capital that went into such companies over the nearly five-year stretch from 2008 to 2012.

As the latest infographic from Big Data Startups reveals, the following big data startups have captured the most investment:

  • Cloudera, a leading Hadoop distribution vendor, was reported to have raised $65 million by December last year
  • Palantir was reported to have raised $100 million at a valuation of over $9 billion
  • Mu Sigma, a company dealing in analytics tools, raised $108 million two years ago and now counts Microsoft among its customers
  • MongoDB had announced a $150 million round by October
  • Opera Solutions, a predictive analytics provider, grabbed $84 million in 2011

Cloud computing and big data enable real-time decisions

Big data analytics and cloud computing together have the potential to act as a huge springboard for real-time decision making. As big data and the cloud become increasingly intertwined, they are delivering more efficient and accurate results in real time.

A close relationship between big data and cloud computing, coupled with real-time processing capabilities, will give rise to data analytics capable of producing real-time results that can change the way companies build and market products. This sentiment was recently expressed by Amazon Chief Technology Officer Werner Vogels while talking to The Guardian.

Big data tools and technologies offer new, automated ways to condense large amounts of data into an understandable format almost instantly. Unlike conventional business intelligence, which was limited to analyzing historical data, big data can not only analyze what has already happened but also process data as it is produced, in real time.

In Vogels' view, big data and cloud computing share a very close relationship, since the cloud places virtually no limits on storage and compute. When that capacity is combined with real-time processing, data analytics can produce real-time results for the companies using it. Enterprises will be able to take decisions based on real-time processing and analysis of data, that is, decisions based on the current state of the business rather than on past information, as has been the case until now.
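
To make the contrast with batch-style business intelligence concrete, here is a minimal sketch in Python of the kind of always-on aggregation described above. The event source, window size and alert threshold are illustrative assumptions, not anything Amazon has described.

    from collections import deque
    from statistics import mean

    # Hypothetical real-time monitor: keep a rolling window of the latest
    # order values and react as soon as the average shifts, instead of
    # waiting for an end-of-day batch report.
    class RollingOrderMonitor:
        def __init__(self, window_size=100, alert_threshold=250.0):
            self.window = deque(maxlen=window_size)   # most recent order values
            self.alert_threshold = alert_threshold    # invented business rule

        def on_order(self, value):
            """Called for every incoming order event, e.g. from a cloud stream."""
            self.window.append(value)
            rolling_avg = mean(self.window)
            if rolling_avg > self.alert_threshold:
                print(f"Average order value {rolling_avg:.2f} exceeds threshold; "
                      "adjust pricing or stock in real time")
            return rolling_avg

    monitor = RollingOrderMonitor()
    for order_value in (120.0, 380.0, 310.0, 290.0):   # stand-in for a live feed
        monitor.on_order(order_value)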

Imagine a scenario where companies could take manufacturing and marketing decisions based on the most current data, stored in the cloud and analyzed with big data tools. How much more effective would decisions be if they were based on the information available right now?

This is the kind of radical impact business decision making could experience if companies can figure out how to use cloud computing and big data in harmony.

It is evident that the cloud and big data make a good team, since you can store as much data as you like in the cloud, as and when it arrives. But at some point businesses will have to consider whether this combination is becoming too expensive for them.

SiSense makes big data analytics possible on a Chip

Thriving on the concept of 'big data meets business intelligence,' a company in Israel has worked out a way to let enterprises run analytics in the cache memory of a CPU, minimizing the hardware costs they would otherwise incur.

Ask someone what big data's biggest drawback is and they will outright say 'more hardware', meaning more expenditure. Since organizations see their big data volumes more than double each year, they need ways to store all this data, and not just store it but also process, search and analyze it from time to time. All of this requires IT infrastructure, which costs a lot.

As an innovative stride toward helping organizations save considerably on hardware, SiSense, a big data analytics company, has designed a way to do analytics in the cache memory of the central processing unit: analytics on a chip. The idea of rethinking business intelligence, with high speed and smaller working sets on multicore processors, began back in April when SiSense received $10 million in Series B funding from Battery Ventures along with Opus Capital and Genesis Partners.

SiSense has essentially worked out a technique to make analytics software run on multicore processors, rather like a parallel compute cluster on a chip. Explaining the efficiency of the system, SiSense CTO Eldad Farkash says their analytics-on-a-chip system can run queries against almost 20 terabytes of data. According to Farkash, the technology works on Intel and AMD multicore 64-bit architectures and is probably the first to support Intel's new Haswell architecture.

Explaining future applications of the technology, Farkash says an analytics system like theirs could be something we carry in the palm of our hands: a system that runs on new-age iPads and Android tablets with terabytes of storage.

Seen from a broader perspective, SiSense could be a departure from high-performance computing of big data on Hadoop Hive. This could save enterprises a lot of what they currently spend on maintaining their data and analyzing it across various platforms. SiSense makes for a good way to take big data analytics beyond Hadoop hives. Your take?

Via: InformationWeek
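
SiSense has not published its internals, but the general idea Farkash describes, running a query in parallel across CPU cores over in-memory data, can be sketched roughly in Python as follows; the column data and chunking scheme are purely illustrative assumptions.

    import multiprocessing as mp

    def partial_sum(chunk):
        """Aggregate one in-memory chunk of a numeric column."""
        return sum(chunk)

    def parallel_column_sum(column, workers=None):
        """Split a column across CPU cores and combine the partial results,
        loosely mimicking a 'parallel compute cluster on a chip'."""
        workers = workers or mp.cpu_count()
        chunk_size = max(1, len(column) // workers)
        chunks = [column[i:i + chunk_size] for i in range(0, len(column), chunk_size)]
        with mp.Pool(workers) as pool:
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        revenue_column = list(range(1_000_000))     # stand-in for a columnar dataset
        print(parallel_column_sum(revenue_column))  # 499999500000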

Big data to drop insurance costs for young drivers

Car insurance firms in many parts of the world are using big data to monitor drivers' behavior and help them lower their insurance costs. Big data is also helping drivers improve their driving.

Insurance firms such as Progressive in the United States, Generali Insurance Group in Italy and Tesco Bank in the United Kingdom are employing different ways to track their customers' driving routines and are using this data to minimize insurance costs for better drivers.

The idea behind monitoring individual drivers is to offer them lower insurance prices and to make them better drivers. The insurance companies believe that by directly observing how people drive, they will be able to change the way insurance works.

Insurance companies have adopted this technique of monitoring driver data because the years-old technology has only recently become affordable. For now, firms are trying to convince customers that observing driving behavior is actually a good thing, and customers are still getting used to being tracked every second.

How this works

Insurer Progressive in the U.S. boasts more than a trillion seconds of driving data from a total of 1.6 million customers. To monitor each individual, the insurer installs its Snapshot device, a telematics unit, in a person's car to record second-by-second data on how fast they drive and at what time of day they drive. The device beeps three times when the brakes are applied suddenly.

The idea behind installing telematics in cars is simple: to train people to drive better. The insurer believes that observing an individual's driving can help change the way insurance works. An 18-year-old pays a lot for insurance, yet some 18-year-olds are genuinely good, safe drivers who deserve better deals.
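
Progressive has not documented how Snapshot works internally, but a simplified, purely hypothetical illustration of how a telematics unit might flag harsh braking from second-by-second speed readings could look like this in Python (the threshold is an assumption, not Progressive's actual rule):

    # Hypothetical harsh-braking detector over per-second speed samples (km/h).
    HARSH_DECEL_KMH_PER_S = 12.0   # illustrative threshold only

    def harsh_braking_events(speeds_kmh):
        """Return the sample indices where speed dropped sharply within one second."""
        events = []
        for i in range(1, len(speeds_kmh)):
            deceleration = speeds_kmh[i - 1] - speeds_kmh[i]
            if deceleration >= HARSH_DECEL_KMH_PER_S:
                events.append(i)      # a real device would beep at this point
        return events

    trip = [50, 52, 51, 38, 36, 20, 21]    # one reading per second
    print(harsh_braking_events(trip))      # [3, 5]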

New scope against traditional car insurance

Traditional car insurance is based on averages, not specifics. You fill out a form stating your age, the type of car you drive and other details such as gender, and based on this information your risk is mapped and you are placed in a predetermined insurance bracket. In a nutshell, thousands of people end up with the same risk rating even though their driving abilities, and therefore their risks, are different.

Eventually, most people end up overpaying for their car insurance!

The applications

With big data technologies like telematics, plus analytics that gather data from social networks and other platforms, insurers will be able to refine individual risk profiles and price insurance more accurately based on this avalanche of data.

If data monitoring reveals that a driver drives less frequently and drives safely, that person should be able to save on insurance, paying less than someone of a similar age with a similar car but a rash driving record. Big data analytics is the most practical way to achieve this.
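
As a rough illustration of the idea, and not any insurer's actual formula, a usage-based discount could be computed from mileage and harsh-braking frequency along these lines (all weights and caps are invented):

    # Toy usage-based pricing: discount a base premium using driving behaviour.
    def usage_based_premium(base_premium, annual_km, harsh_brakes_per_100km):
        low_mileage_discount = 0.10 if annual_km < 8000 else 0.0
        smooth_driving_discount = max(0.0, 0.15 - 0.03 * harsh_brakes_per_100km)
        total_discount = min(0.25, low_mileage_discount + smooth_driving_discount)
        return round(base_premium * (1 - total_discount), 2)

    # A careful low-mileage 18-year-old vs. a rash driver with the same base premium.
    print(usage_based_premium(1200.0, annual_km=6000, harsh_brakes_per_100km=1))   # 936.0
    print(usage_based_premium(1200.0, annual_km=15000, harsh_brakes_per_100km=6))  # 1200.0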

Currently, however, only 2 percent of US car insurers offer driving-monitored insurance, though this is expected to rise to 10-15 percent over the next five years. Besides determining risk profiles, big data analysis can also help determine which driver was at fault in an accident.

Source: BBC

Demand for Big Data analysts surges in UK

Companies in the United Kingdom are looking to take a giant leap ahead in the global effort to deal with the volume, velocity and variety of data generated each day. But UK companies are finding it hard to recruit employees with the big data analytics and modeling skills they require, which is escalating the demand for big data specialists.

Big data has made its presence felt and, by all indications, it is here to stay. Even technology specialists who previously brushed off big data as a buzzword have acknowledged its importance for enterprises.

The global economy is being transformed by big data analytics at a brisk pace. To make the most of this staggering volume of data, one-third of large UK companies are set to adopt big data technologies in the next five years. What does this shift imply for the demand for employees?

According to "Big Data Analytics: Adoption and Employment Trends", a joint report released by e-skills UK and the SAS Institute, the rapid adoption of big data technologies in UK organizations is driving demand for critical data analytics skills. The report projects a 243% increase in demand for big data specialists in the UK by 2017, which translates into almost 69,000 big data experts needed by organizations to make fact-based business decisions.

The report further reveals that large UK organizations implementing big data technologies currently have about 94 crore big data users, a number expected to rise by 177% by 2017.

Recognizing how rapidly the economy is being transformed by big data technologies, the UK's Minister of State for Trade and Investment, Lord Green, believes this is the UK's best opportunity to master the volume, velocity and variety of data and to lead the global vision on big data. For that to happen, Green says, the UK's government, businesses and academia will have to work in tandem to develop the skills that will foster this growth.

Karen Price, CEO of e-skills UK, considers big data analytics skills to be of strategic importance for the UK. Price believes businesses and government need to treat big data skills as strategically important, alongside mobile computing, communications and cyber security, since these skills will be highly relevant in the near future.

When Facebook Concluded Largest Hadoop Data Migration Ever

Since the inception of Facebook in particular, the era of storing massive amounts of data on servers has well and truly arrived. The content shared on the internet grows enormously with every passing day, and managing it is becoming a problem for organizations across the globe.

Facebook recently undertook the largest data migration ever. The Facebook infrastructure team moved dozens of petabytes of data to a new data center: not an easy job, but a task well executed.

Over the past couple of years, the amount of data stored and processed by Facebook servers has grown exponentially, increasing the need for warehouse infrastructure and superior IT architecture.

Facebook stores its data on HDFS — the Hadoop distributed file system. In 2011, Facebook had almost 60 petabytes of data on Hadoop, which posed serious power and storage shortage issues. Geeks at Facebook were then compelled to move this data to a larger data center.

Data Move

The amount of content exchanged on Facebook daily has created demand for a large team of data infrastructure professionals who analyze all that data and serve it back in the quickest and most convenient way. Handling data at this scale requires large data centers.

So considering the amount of data that had piled up, Facebook’s infrastructure team just concluded the largest data migration ever. They moved petabytes of data to a new center.

To pull off the migration, Facebook set up a replication system to mirror changes from the smaller cluster to the larger cluster, which allowed all the files to be transferred.

First, the infrastructure team used the replication system to bulk-copy data from the source cluster to the destination cluster. Then the smaller files, Hive objects and user directories were copied over to the new cluster.

The process was complex, but because the replication approach minimizes downtime (the time needed to bring the old and new clusters to an identical state), it became possible to transfer data at this scale without a glitch.
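
Facebook has not published the code behind its replication system, but the general bulk-copy-then-replay pattern described above can be sketched in outline. The Python below uses local directories as stand-ins for HDFS clusters and is only an illustration of the pattern, not Facebook's tooling.

    import shutil
    from pathlib import Path

    # Phase 1: copy the bulk of the data while the source stays live.
    def bulk_copy(source: Path, destination: Path):
        for path in source.rglob("*"):
            if path.is_file():
                target = destination / path.relative_to(source)
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(path, target)

    # Phase 2: replay files created or modified during phase 1 (collected by
    # some change-tracking mechanism), so the two clusters converge with
    # minimal downtime before the final switch-over.
    def replay_changes(changed_files, source: Path, destination: Path):
        for relative_name in changed_files:
            target = destination / relative_name
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(source / relative_name, target)

    # bulk_copy(Path("/old_cluster"), Path("/new_cluster"))
    # replay_changes(["logs/2013-10-01.gz"], Path("/old_cluster"), Path("/new_cluster"))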

Learning curve

According to Facebook, the infrastructure team had used a replication system like this before. But the earlier clusters were smaller and could not keep up with the rate at which data was being created, which meant they were no longer enough.

The team worked day in and day out for the data transfer. With the use of the replication approach, the migration of data became a seamless process.

Now that this massive volume of data has been moved to a bigger cluster, Facebook can keep delivering relevant data to all its users.

Big data education, a good way to kick start your career

There have been plenty of technological shifts in information technology since its inception, but if experts are to be believed, none has been more significant than the one caused by the rise of big data. This immense change has created huge demand for data analysts who can help companies manage big data.

Big data is the enormous increase in data brought about by the heavy use of mobile devices, social media and new media platforms. Harnessed properly, big data is a great asset for organizations: it enables them to make specific, accurate, data-derived decisions based on the past and expected behavior of prospective customers.

Organizations have realized the importance of big data and have begun to analyze large data sets for better business and operational understanding. But all of this began only after companies like Facebook and Google built custom databases to manage and process this flood of data.

Being immensely important, big data systems comprise large software stacks and unconventional databases that need special training to use properly. It should come as no surprise that, considering the importance, the remuneration and the level of skill required to manage this data, big data analyst was considered the best job of 2013.

An ever-increasing number of engineering graduates are enrolling in universities offering courses in big data. To attract the best engineering students, universities like MIT and Harvard have begun offering courses in big data and data science. These courses have made the field of big data competitive and rewarding.

As the field of data science heats up, the number of options for getting an education in it keeps growing. In addition to universities and colleges, a growing number of online learning platforms now offer technical training and certification in big data science, including Khan Academy, Coursera and Big Data University. These offer courses free of charge, a great advantage over expensive university courses.

These online learning platforms are a great medium for professionals looking for a time- and cost-effective way to acquire the necessary big data training and skills. Because the courses are online, you don't need to sacrifice your job or current profession to pursue them.

The considerable rise in big data learning platforms offers great potential for engineering students looking to stay ahead in the new world of big data. Being free of cost, these platforms also give professionals a great opportunity to pick up new skills and stay ahead in their careers without compromising their existing jobs.

5 Reasons it has become really important to Move to Big Data

The time when people and organizations worked with small amounts of data is behind us. Data is getting bigger by the day, so people have to find new ways to manage this influx: big data. Big data is the explosive growth and availability of structured and unstructured data, and it is extremely important for organizations to manage it well, to their advantage.

Since more data means more realistic information, and more information translates into more accurate analysis, big data, as the basis for that analysis, has become as vital as the internet for people, businesses and society at large.

A buzz has built up around big data, and IT executives have begun to understand its advantages. Universities and colleges are producing capable data analysts, and the career path as a whole has received a real boost.

With precise decision making as the end result, big data applications and software deliver increased operational efficiency, cost reduction and risk reduction for organizations.

Companies and organizations of all sizes have begun to reap the benefits of data analytics. If you haven't been part of this change yet and are still weighing how wise it would be to venture into big data analysis, you have come to the right place. Below are five reasons it has become really important to move to big data, if you haven't already.

Better data management

Data is the most significant input for decision making and cost effectiveness in any organization. Many organizations have understood this and hired data scientists to measure, map and process the humongous amounts of data they receive, in order to understand both the past and the future.

Since it takes significant technological know-how to really analyze and process all this data, just another analyst can do little.

The growing demand for professional big data experts in all parts of the world has made the career more lucrative and rewarding.

Utilize cloud storage and benefit

Organizations with a large influx of data can use third-party cloud storage services, which provide substantial storage and the necessary computing capacity for a specific period of time; a minimal upload sketch follows the list below.

The two chief advantages of using a cloud service are:

  • It lets organizations analyze and process big data without having to invest too much in their own IT infrastructure.
  • The organization's internal IT experts can draw on the expertise needed to deal with gigantic chunks of data.
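
As a minimal sketch of the first advantage, assuming an AWS S3 account and the boto3 client library (the bucket and file names below are placeholders), offloading a data set to third-party storage can be as simple as:

    import boto3  # AWS SDK for Python; assumes credentials are already configured

    s3 = boto3.client("s3")

    # Push a local data set to rented storage (placeholder names).
    s3.upload_file("daily_sales.csv", "example-analytics-bucket", "raw/daily_sales.csv")

    # Later, an analyst or a rented compute instance pulls the data back down.
    s3.download_file("example-analytics-bucket", "raw/daily_sales.csv", "daily_sales_copy.csv")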

Easier understanding of data for clients

If an organization can understand its past and future from the huge amount of data it processes and stores, it can deliver a clear visual representation of that data for prospective clients in the form of simple charts and graphs.

Such data visualization helps users see what is what and, therefore, make quick, calculated decisions.
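
For instance, a chart of the kind described here could be produced with Python's matplotlib library; the revenue figures below are invented for illustration.

    import matplotlib.pyplot as plt

    # Illustrative quarterly revenue figures (made up for this example).
    quarters = ["Q1", "Q2", "Q3", "Q4"]
    revenue_millions = [4.2, 5.1, 4.8, 6.3]

    plt.bar(quarters, revenue_millions)
    plt.title("Quarterly revenue")
    plt.ylabel("Revenue (millions)")
    plt.savefig("quarterly_revenue.png")   # an easy chart to share with clients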

Opening new opportunities for yourself

As data analytics tools mature, more and more organizations and people are realizing the true competitive advantage of using big data.

Once a business is thoroughly familiar with the preferences, likes and dislikes of its customers, it can focus its ventures on what is most rewarding.

Helps in evolution

Data is no longer limited to numbers; today it comes in many forms, including video, audio, text and graphics. All of this data provides valuable insight, allowing organizations and people to use the right tools to understand specific patterns and evolve their businesses in a planned, well-regulated way.

The real problem with big data isn't the abundance of data; it is that we cannot easily figure out how this exponentially growing data should really be used. Hopefully the points above have made a difference.

How big data influences the stock market

In a real-world scenario, if market manipulators are taken out of the equation, buying stock becomes a worthy acquisition. By acquiring stock, you place a high stake on how the company performs, since the market itself is the true determinant of the value of the company you're investing in. But what if the data on which those decisions are made increases manifold? Read on to learn how this rapid rise in data influences stock markets.

Due to heavy reliance on and use of mobile devices, new media and social media, there has been a massive increase in the amount of data being generated and processed. This rise in data is the result of a rapid shift of digital activity from the office to online.

Big Data and Stock Markets

Over the years, data (now dubbed big data) has grown almost a billion-fold compared with, say, the period before the rise of the mobile phenomenon. Trading volumes have grown on a similar scale, bringing a comparable surge in trade transactions.

Big data

Big data refers to data sets so humongous that they are virtually impossible for conventional technology to process. Big data offers a great advantage to companies, but only if they have the IT infrastructure and the human know-how to manage and process it.

Elements of big data:

  • Over 90 percent of the data in the world has been created in the past two years or so
  • It is estimated that data production by 2020 will be about 44 times greater than in 2009
  • This enormous increase in data is straining the stock market's IT infrastructure

The Stock Market

In the stock market, people buy stocks for two reasons: because they like the company, and because they want to be part of its growth story. But it turns out that very little real human judgment is actually involved in the stock market anymore.

What we mean is that the number of transactions executed by real humans is a very small fraction of daily trading volumes. The real money game of buying and selling stocks by the most serious traders is being played by automated systems that do the heavy lifting. In this contest of 'my algorithm is better than yours,' the entire stock market has become a robot war, and the amount of data to be processed has grown to a point that the market's IT infrastructure struggles to handle.

In the stock market, each trade creates a ripple effect whose size increases with the size of the trade. When trading happens at uncontrollable speeds, the ripple effect can confuse other machines, leading to a surge of buying and selling until profit is maximized. Even though the market is influenced by these ripple effects, current stakeholders make every effort to keep the system operational.

However, due to the speed, scale and volume of the data being created on the stock market, the problem of making sense of it all keeps growing manifold.

Big data, if not managed properly, can itself be dangerous for the market, and coupled with such speed, scale and volume, the effect will surely be something the market has not seen before. With this growing speed, data is making the human touch ever less significant in the market. The market is becoming a robotic battlefield where large IT infrastructure and supercomputers play the money game.

All Are Valuable Members of Hadoop Community says Cloudera CEO

Within three months of taking over the leadership of the company, Cloudera CEO Tom Reilly has already laid out a vision of where the company is headed.

According to him, the company needs a strong, far-sighted vision if it is to compete against the likes of Hortonworks and MapR for a share of the pie in the rapidly evolving Hadoop market.

Despite the tough competition, Reilly remains a well-wisher of his rivals, whom he views as valuable members of the Hadoop community. His message to his employees is the same: consider all your competitors valuable contributors to the success of the community.

Interestingly, Reilly credits Hortonworks, a fellow startup and rival, with driving the development of YARN, which has provided much-needed impetus to every major player in Hadoop.

He also affirmed that the real competition his company faces comes from information giants such as Pivotal and IBM, not from other startup rivals.

Cloudera's CEO was a little shy about sharing details of his company's change of focus, saving them for a public announcement at the Hadoop World conference to be held next week.

Nevertheless, industry watchers estimate that Reilly’s plans for Cloudera are bigger than before. He doesn’t want the company to become just another Hadoop distribution company. With an ever growing list of features and over 700 partners, he aims to make it a data giant that delivers real value to enterprises.

When confronted with the question of Hortonworks luring Spotify away from Cloudera, Reilly has an altogether different take. He concedes that the development hurt from a public relations perspective, but says it's not something that will drag the company down.

He explained that Spotify wanted comprehensive enterprise support and was no longer interested in using the free version of Cloudera's software. Like Hortonworks, his company quoted a price for the deal. However, Hortonworks put together a slightly better contract, and Cloudera intentionally didn't try to match it, claiming it didn't make good business sense.

Reilly sounds rather dismissive when he states that the deal didn't matter much to them and that all Spotify got out of the contract was a low-priced vendor. But deep down, Reilly knows that to steer his company toward profitability, he needs to outperform his competitors.

Experts believe that even though Cloudera has a lot on its plate, the 800-pound Hadoop startup can't distance itself from the present competition.

Unless the company takes a big leap to stand alongside the information giants, it will have to live with the image of a Hadoop startup.