
Planet Big Data is an aggregator of blogs about big data, Hadoop, and related topics. We include posts by bloggers worldwide. Email us to have your blog included.

 

September 26, 2017

Ronald van Loon

For AI to Change Business, It Needs to Be Fueled with Quality Data

There’s no doubt that AI has usurped big data as the enterprise technology industry’s favorite new buzzword. After all, it’s on Gartner’s 2017 Hype Cycle for Emerging Technologies for a reason.

While progress was slow during AI’s first few decades, advancement has accelerated rapidly over the last ten years. Some say AI will augment humans and maybe even make us immortal; more pessimistic voices say AI will lead to conflict and may even automate our society out of jobs. Despite the differences in opinion, the fact is that few people can say what AI really is. Today we are surrounded by small forms of AI, like the voice assistants in our smartphones, often without noticing the service at work. From Siri to self-driving cars, AI has already shown a lot of promise and the benefits it can bring to our economy, our personal lives, and society at large.

The question now turns to how enterprises will benefit from AI. But before companies or people can obtain the numerous improvements AI promises to deliver, they must first start with good-quality, clean data. Having accurate, cleansed, and verified information is critical to the success of AI: the data that fuels AI-driven applications must be trusted, timely, and of the highest quality.

Data Quality and Intelligence Must Go Hand-in-Hand

Organizations currently use data to extract the informational assets that feed their strategic plans. Those plans dictate the future of the organization and how it fares against rising competition. Considering the importance of data, the potential impact of low-quality information is intimidating to contemplate. In fact, bad data costs the US about $3 trillion per year.

Recently, I had the opportunity to interview Nicholas Piette and Jean-Michel Franco from Talend, one of the leading big data and cloud integration companies. Nicholas Piette, Chief Evangelist at Talend, has been working with integration companies for nine years and has been part of Talend for over a year.

When asked about the link between both Data Quality and Artificial Intelligence, Nick Piette responded with authority that you cannot do one without the other. Both data quality and AI walk hand-in-hand, and it’s imperative for data quality to be present for AI to be not only accurate, but impactful.

To better understand the concept of data quality and how it impacts AI, Nick used the five R’s method, which he said was taught to him by David Shrier, his professor at MIT. The five R’s are:

  1. Relevancy
  2. Recency
  3. Range
  4. Robustness
  5. Reliability

If the data you are using to fuel your AI-driven initiatives ticks off each of these R’s, then you are off to the right start. All five hold particular importance, but relevancy rises above the rest: whatever data you have should be relevant to what you do, and should serve as a guide rather than a deterrent.
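As a rough illustration of how two of these R’s can be checked mechanically, here is a minimal Python sketch covering recency and reliability (the other R’s usually require domain judgment). The DataFrame and its columns are hypothetical, purely for demonstration.

    import pandas as pd

    def recency_score(df: pd.DataFrame, ts_col: str, max_age_days: int = 30) -> float:
        """Share of rows updated within the last max_age_days days."""
        age = pd.Timestamp.now() - pd.to_datetime(df[ts_col])
        return float((age <= pd.Timedelta(days=max_age_days)).mean())

    def reliability_score(df: pd.DataFrame) -> float:
        """Crude reliability proxy: overall share of populated (non-null) cells."""
        return float(1.0 - df.isna().mean().mean())

    # Toy data, purely illustrative:
    now = pd.Timestamp.now()
    df = pd.DataFrame({
        "updated_at": [now - pd.Timedelta(days=2), now - pd.Timedelta(days=400)],
        "revenue":    [125.0, None],
    })
    print(f"recency:     {recency_score(df, 'updated_at'):.0%}")   # 50%
    print(f"reliability: {reliability_score(df):.0%}")             # 75%

Checks like these are cheap enough to run on every ingest, which is exactly when data quality problems are easiest to catch.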

We may reach a point where the large influx of data at our fingertips is too overwhelming for us to tell which elements are really useful and which are disposable. This is where the concept of data readiness enters the fold. Having mountains of historical data can be helpful for extracting patterns, forecasting cyclical behavior, or re-engineering processes that lead to undesirable outcomes. However, as businesses continue to advance toward the increased use of real-time engines and applications, data readiness—or information that is the most readily or recently made available—matters more and more. The data that you apply should be recent and should reflect reality.

AI Use Cases: Once You Know the Rules, How do You Play the Game?

When asked for the best examples of AI at work today, Nick said he considered the use of AI in healthcare a shining example of both what has been achieved with AI to date and what more companies can do with this technology. More specifically, Nick said:

“Today, healthcare professionals are using AI technology to determine the chances of a heart attack in an individual, or predict cardiac diseases. AI is now ready to assist doctors and help them diagnose patients in ways they were unable to do before.”

All accolades aside, the use of AI in healthcare is also currently dictated by our understanding or interpretation of what the AI algorithms produce. Thus, if an AI system comes up with new insights that seem ‘foreign’ to our current understanding, it’s often difficult for the end-user to ‘trust’ that analysis. According to Nick, the only way society can truly trust and comprehend the results delivered by AI algorithms is if we know that at the very core of those analyses is quality data.

Nicholas Piette added that ensuring data quality is an absolutely necessary prerequisite for any company looking to implement AI. In his words:

“100% of AI projects are subject to fail if there are no solid efforts beforehand to improve the quality of the data being used to fuel the applications. Making no effort to ensure the data you are using is absolutely accurate and trusted—in my opinion—is indicative of unclear objectives regarding what AI is expected to answer or do. I understand it can be difficult to acknowledge, but if data quality mandates aren’t addressed up front, by the time the mistake is realized, a lot of damage has already been done. So make sure it’s at the forefront.”

Nick also pointed out that hearing they have a data problem is not easy for organizations to digest. Adding a light touch of humor, he said “Telling a company it has a data problem is like telling someone they have an ugly child.” But the only way to solve a problem is to first realize you have one and be willing to put in the time needed to fix it.

Referring to the inability of companies to realize that they have a problem, Nicholas pointed out that more than half of the companies he has worked with did not believe they had a data problem until it was pointed out. Once it was, they had their “aha!” moment.

Nick Piette further voiced his opinion that it would be great if AI could, in the future, explain exactly how it reached an answer and the computations that went into that conclusion. Until that happens, data quality and AI remain interlinked, and there is no way to achieve success in AI without complete accuracy in the data you feed into the machine.

 “If you want to be successful, you have to spend more time working on the data and less time working on the AI.”

Nicholas Piette (Talend)

If you want to learn more about the concept of data quality, you can click here.

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. He contributes his expertise towards the rapid growth of Simplilearn’s popular Big Data & Analytics category.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for data science, big data, business intelligence, and data mining. He is a guest author on leading big data sites, a speaker, chairman, and panel member at national and international webinars and events, and runs a successful series of webinars on big data and digital transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in big data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience, and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.


The post For AI to Change Business, It Needs to Be Fueled with Quality Data appeared first on Ronald van Loons.

 

September 25, 2017


Revolution Analytics

News Roundup from Microsoft Ignite

It's been a big day for the team here at Microsoft, with a flurry of announcements from the Ignite conference in Orlando. We'll provide more in-depth details in the coming days and weeks, but for now...

...
 

September 24, 2017


Forrester Blogs

Hybrid By Design Vs. Hybrid By Accident

As a veteran of enterprise IT, I can tell you there’s a difference between “hybrid by design” and “hybrid by accident.” Let’s be frank: you are probably doing hybrid by accident – just about everybody is. Hybrid by...

...

Simplified Analytics

How the Digital has changed the way we communicate

Digital Transformation has impacted every business on earth and all aspects of life today.  Communication is the most impacted area of our life. Way back in 1997, most of the communication was...

...
 

September 22, 2017


Revolution Analytics

Because it's Friday: Blue skies or SkyNet?

I enjoyed attending the O'Reilly AI conference in San Francisco this week. There were many thought-provoking talks, but in the days since then my thoughts kept returning to one thing: incentives. One...

...

Revolution Analytics

Tutorial: Launch a Spark and R cluster with HDInsight

If you'd like to get started using R with Spark, you'll need to set up a Spark cluster and install R and all the other necessary software on the nodes. A really easy way to achieve that is to launch...

...

Forrester Blogs

Smartphones are dead. Long live smartphones!

This week, Google announced the acquisition of key HTC assets. This will give them some of the hardware technology expertise, the design skills and the experience in smartphone retail distribution...

...
InData Labs

What companies need to understand to benefit from AI technology

Although businesses are actively looking for ways to put AI technology to work and improve business results, many of them are still hesitant when it comes to undertaking necessary measures associated with certain corporate changes. Here are the tips that might help overcome those doubts and evaluate your company’s readiness to adopt the AI technology....

The post What companies need to understand to benefit from AI technology appeared first on InData Labs.

Principa

Right-Party-Connect Rates in 2018

The importance of the collections cascade effect?

 

September 21, 2017


Revolution Analytics

Pirating Pirate Data for Pirate Day

This past Tuesday was Talk Like A Pirate Day, the unofficial holiday of R (aRRR!) users worldwide. In recognition of the day, Bob Rudis used R to create this map of worldwide piracy incidents from...

...
Silicon Valley Data Science

Creating a Digital Strategy

Editor’s note: Welcome to Throwback Thursdays! Every third Thursday of the month, we feature a classic post from the earlier days of our company, gently updated as appropriate. We still find them helpful, and we think you will, too! In this case, the original version of this post was under the title “Optimizing Your Digital Strategy.” That original version can be found here.

We know that digitization is a disruptive force that can help your company stand out from the competition. It needs to be a priority, but where should you start? Let the key aspirations that define the vision for your company be the cornerstones of your approach towards digitization. This ensures that the areas in your digitization plan are the right ones to focus on.

Digitization Mega-Trend

Virtually every company in every industry is waking up to the mega-trend of digitization. All the traditional “analog” business processes that have been performed by people moving paper around are fast becoming “digital,” in both how they are performed and what inputs and outputs the processes use. Further, the solid walls of the traditional company are becoming virtual, with increasing digital connectedness to suppliers, customers, partners, and distribution channels, and to the data these connections generate. One of the early manifestations of this digitization trend was e-commerce, which disrupted retail by leveraging internet marketing, online transaction processing, electronic fund transfers, electronic data interchange, and more to enhance buying and selling. McKinsey estimates that “E-commerce is growing at double-digit rates in the United States and most European countries and it is booming across Asia.” Digitization has expanded well beyond retail. What this means is that more information is being converted into a digital format, more software is being used to automate business processes, more data is being generated and consumed by these processes, and more interconnection of products, value chains, and business models is occurring.

IDC estimates that about two-thirds of the digital universe is created by consumers (i.e., generated by people rather than machines), but enterprises are responsible for 85% of it, in that they cause the generation and capture of this data. In addition, the Internet of Things (IoT) is also exploding (i.e., data generated by machines). Instrumentation is being added to virtually everything, from workout clothes, baby socks, and dog collars to thermostats and cars that monitor driving behavior. The natural extension of this is to connect the machine-generated data, via the internet, to applications (mobile or web-based) and embed analytics to enable better management of assets and resources, to enable the asset to control its surroundings, or to extend the asset with digital services, for example.

The potential for digitization to be a game changer is inherent in its capability to generate new business models. McKinsey states, “Digitization is fundamentally altering the nature of competition.” However, the reality of turning digital priorities into new ways of doing business is not as intuitive as it may seem. It impacts virtually every function and line of business—the way they are organized, how they get things done, the technology used to perform their activities, and the metrics the business uses to measure itself. Going digital is not simply an IT discussion, and it requires more than rough ideas from the lines of business on the areas of focus and scope of the effort.

Be Data Driven

Digitization means data. Lots of data in all its volume, variety, velocity and veracity. Big Data. Bain contends that, “Big Data isn’t just one more technology initiative. In fact, it isn’t a technology initiative at all; it’s a business program that requires technical savvy.” It is not as simple as adding more IT support and storage capacity and assuming that the business will spontaneously start generating data-driven insights to make smarter business decisions.

Most companies recognize the value of being data driven, but struggle because their data is all over the place. Data can be persisted in traditional data warehouses, new HDFS storage locations, public or private clouds, and so on. Companies lack visibility into their data repositories. How is data actually being used? What data is missing? What data is needed? Often this is due to a siloed view of data, and corresponding data management, which may be optimal for a certain function in the organization but suboptimal for the strategic and foundational processes that underpin the business model and cut across all key functional areas, connecting to suppliers and customers. Tribal knowledge about data sources or systems of record is not easily translated to others and may leave with the subject matter expert when they walk out the door. Some of this knowledge is implicit in the information systems that facilitate the flow of data within legacy systems or applications. Some is obscured in code that hasn’t been looked at in a long time—and is still assumed to be relevant to the business, along with the underlying assumptions made when the systems were designed. Sometimes these systems are sunsetted due to outsourcing, particularly for support functions that are not strategically vital to the business. While there is value in doing this, it also creates its own set of challenges and may affect the data available for insight discovery. Finally, there are new sources of data that are not traditionally in play but are potentially useful.

While there has been plenty of debate over centralized versus distributed enterprise data warehouse architecture strategies, there is much more at play now. Data may not be physically centralized, but it can be managed as an integrated whole, with strategic imperatives and business objectives as the driving force. Companies need big-picture thinking about data. Hadoop distributed file systems are affordable and support open-source tool sets for exploring, analyzing, and gaining insight from all this data, but wading into the data lake with Hadoop doesn’t automatically mean the business value will follow. What is needed is a roadmap for business success.

Develop a Data Strategy

In order to ensure that the areas in your digitization plan are the right ones to focus on, you need to let the key aspirations that define the vision for your company be the cornerstones of your approach towards digitization. These aspirations can be articulated as concrete business objectives that are a tangible means to achieving your aspirations. For example, aspiring to grow revenue means being preoccupied with your customers and making sure they are taken care of, advocating for you and loyal to you. You want to improve the customer experience across all your channels and make every interaction with the customer a win-win. Your customers get to the answers or actions they need quickly; you have an opportunity to learn more about them or educate them about things that are genuinely helpful to them. Often this requires the corresponding back-office functions to be quick, seamlessly integrated, and flawless. Getting very specific about how to accomplish these things will enable your imperatives to be successful.

Once the business objectives are concrete, the next step is to translate these into business/technical Use Cases. Use Cases are best described as distinct analytics life cycles—the end-to-end value chain which articulates what efforts are required to get from raw data to insight discovery for real decisions that enable each of your business objectives. The data value chain can be described for each of your business objectives. For example, in the case of growing your revenue, let’s say one business objective is “to develop and execute a new targeted and personalized marketing campaign” in order to attract prospects. This campaign is to be delivered via a mobile app to prospects that are unhappy with their current product, are likely to respond to your invitation or offer, and meet your criteria as a desirable customer. The data value chain encompasses the high-level activities of data ingestion, processing, persistence, integration, analytics, and exposure (making the insights operational). The value chain for this scenario could be:

  • Ingestion: social media, internal customer profile data, historic campaign response data
  • Processing: text analytics, data normalization, transformation, manipulation, matching, enriching, etc.
  • Persistence: storing the enriched data
  • Integration: joining and matching multiple data sources
  • Analytics: training predictive models for campaign response likelihood; extracting key phrases and sentiments with machine learning algorithms to personalize offers
  • Exposure: enabling the campaign management application with the necessary inputs and scores
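To make the chain tangible, here is a deliberately skeletal Python sketch of those six stages wired together. Every function body is a placeholder (the record fields are invented for illustration); a real implementation would call social media APIs, a warehouse, and a trained model at the marked points.

    from typing import Any

    Records = list[dict[str, Any]]

    def ingest() -> Records:
        """Pull raw inputs: social posts, customer profiles, campaign history."""
        return [{"customer_id": 1, "text": " Thinking of SWITCHING providers ",
                 "past_response": True}]

    def process(records: Records) -> Records:
        """Normalize and enrich (here: trivial text cleanup)."""
        return [{**r, "text": r["text"].strip().lower()} for r in records]

    def persist(records: Records) -> Records:
        """Store enriched data; a real system writes to a lake or warehouse."""
        return records

    def integrate(records: Records) -> Records:
        """Join or match against other sources (a no-op in this sketch)."""
        return records

    def analyze(records: Records) -> Records:
        """Score response likelihood; a real system would apply a trained model."""
        return [{**r, "response_score": 0.9 if r["past_response"] else 0.3}
                for r in records]

    def expose(records: Records) -> None:
        """Hand the scores to the campaign management application."""
        for r in records:
            print(r["customer_id"], r["response_score"])

    # Run the chain end to end.
    expose(analyze(integrate(persist(process(ingest())))))

The point of sketching it this way is that each stage has a crisp input/output contract, which is what lets you spot, in the next step, the technical workloads shared across business objectives.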

After preparing the Use Cases for each business objective, you can start to see patterns of technical workloads that touch multiple business objectives. The business can take this one step further and prioritize these workloads based on their impact potential (i.e., alignment to business objectives and strategic imperatives) and their ease of implementation (i.e., low hanging fruit).

More information about developing a data strategy can be found here. This method is specifically designed to uncover the technology solutions needed to fulfill the ambitions of the business, guided by business priorities, and highlighting the most impactful areas for immediate investment and technology development. In summary, the key components of this method include:

[Figure: data strategy checklist]

Armed with a Data Strategy to support your digitization mandate, it becomes much easier to prioritize the development of new skills and the investments necessary to transform your business into one that is competitive in the digital marketplace.

The post Creating a Digital Strategy appeared first on Silicon Valley Data Science.


Revolution Analytics

Recap: Applications of R at EARL London 2017

The fourth EARL London conference took place last week, and once again it was an enjoyable and informative showcase of practical applications of R. Kudos to the team from Mango for hosting a great...

...
Ronald van Loon

Digital Innovation Starts with a Digital Core

When prevalent industry trends are discussed among industry folks, the conversation usually goes in one of two directions. It is either varying states of disbelief at the rate of change within the business and IT landscapes, or enthusiastic agreement on the importance of moving with the times and adopting a digital infrastructure. The former is born of a dedication to supposedly “tried and tested” methods, which are arguably worth next to nothing in this day and age. The latter is what more and more business leaders and executives need to embrace.

Relevance of Digital Transformation

The economy, for the most part, has begun to undergo a massive change in the way its entire infrastructure and various other components function. This is due to the onset of the digital age, which has made the idea of a “traditional” economy effectively obsolete; and in good time too, since the relevance of digital transformation is growing by leaps and bounds each day, with some industry experts predicting that the transformation will be complete around 2020.

It is estimated that a whopping 212 billion (yes, billion!) sources will be connected to each other by the time the worldwide digital transformation is complete. To keep up with connectivity at this scale, the business of today has to radically transform itself into one that is not only connected to its customers via cloud-based networks, but also internally linked through a similar network. More than that, it needs a digital core if it is to compete in today’s tough marketplace.

How a Digital Core Impacts a Business

In a recent presentation in London on the importance of innovation in the current business landscape, Martin Frimodt, a Director at ATP and its SAP Finance Head, spoke about how ATP’s next-generation business suite has revolutionary characteristics and can potentially improve the way a business conducts its processes, both internally and externally. A next-generation business suite is an ideal entry into the broader landscape, for the following reasons:

  • It promotes and enables the development of a digital core.
  • Since digital transformation requires a digital core to be in place and functioning, a competent business suite can work wonders for today’s businesses.
  • A seamless transition onto the digital plane is the Holy Grail for businesses. This can be facilitated via the use of advanced business suites.
  • If cost-effectiveness is the business’s chief consideration, a digital business suite, as explained by Martin Frimodt, can be immensely advantageous, since shifting to such a system is a lot cheaper than transitioning to other systems.

In his address, Martin Frimodt discussed how his company chose to adopt the new suite now rather than later, in order to have a stable and functioning array of financial services. At its base, according to him, was the IT team, who not only constantly analyzed the existing system but also generated reports on how smooth the transition would be. In a way, IT was integral to the transformation and the transition to a new suite.

The process is currently underway at Frimodt’s company, and so far it has been smooth sailing, according to him. This shows how a company that aims to usher itself, and others along with it, into the new, efficient digital age can do so without facing an overwhelming number of hiccups. Even when the task is as seemingly outlandish as bringing the world of IT into the world of finance, it can be done with the help of a good business suite.

Importance of Cost-Efficiency in Migration

For a small enterprise, operating on a local scale, the cost of migration can be very low. This is because the amount of data which needs to be transferred and the number of processes that need to be acclimatized to the new system may be proportionately less. However, this is not the case for a company that operates on a much larger scale, especially a company that is involved in people management and coordination on some level.

Such a company requires a cost-effective solution for migrating to a new, innovative system while maintaining the integrity of its data. When it comes to saving costs, the digital core can prove beneficial thanks to its process efficiency.

Why Innovation is so Important at this Stage

Innovation is perhaps the single most important goal that companies and businesses need to strive towards. This is not only because of the immense competition that will undoubtedly arise between companies wishing to gain an edge, but also because the future is, at an essential level, going to be digital.

According to Martin Frimodt, there is a sense of urgency in the business landscape: an urgency to innovate and move past the competition. If one were to compile all the potential advantages that come with innovating and moving internal systems onto a better, more efficient platform, it becomes very easy to see why that urgency is so prevalent.

Reasons to Innovate

Following are some of the points Martin Frimodt put forward, along with why they make the case for innovation.

  • According to Frimodt, “if you do not innovate, you will soon be out of business.” This quote rings true on many levels, partly due to the plethora of practical examples that have emerged over the years, such as Blockbuster, MySpace, and Excite, and partly due to the obvious forward march of technology and of the companies that adopt new technology earliest.
  • According to Frimodt, “Young companies need to teach the old companies how to innovate.” The reasoning behind the quote is well founded, since the majority of young companies over the years have been responsible for some of the most tremendous innovations in existence. On the other hand, older companies fail to innovate and improve their business models and systems to accommodate new growth. Older companies need to learn the advantages of transforming their infrastructure into one that accommodates growth, if they are to succeed in modern times.
  • “Best way to survive is to move away from uncertainty.” Uncertainty in the finance industry is nothing new, considering the global economy, which has been turbulent to say the least. To get a leg up here, businesses need a more streamlined infrastructure with much more stable, time-saving, and efficient processes being enabled across the board.
  • “Innovation is like giving people a computer with the internet in 1995. You cannot explain what is possible, but they explore by using it.” This is certainly true, since not all businesses are likely to adjust to completely new business suites from day one. It is a gradual and thoughtful process; however, if it is done right, there is no limit to the business growth that can take place as a result.

If you would like to read more from Ronald van Loon on the possibilities of a digital core, please click “Follow” and connect with him on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for data science, big data, business intelligence, and data mining. He is a guest author on leading big data sites, a speaker, chairman, and panel member at national and international webinars and events, and runs a successful series of webinars on big data and digital transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in big data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience, and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.


The post Digital Innovation Starts with a Digital Core appeared first on Ronald van Loons.


Forrester Blogs

The 2017 Enterprise Architecture Awards

This Year’s Winning EA Initiatives, Brought To You By Forrester And Infoworld, Drive The Customer-Obsessed Digital Business Any enterprise architects worth their salt can tell you how strategic...

...
 

September 20, 2017


BrightPlanet

Learn About OSINT and Security Risk Management at ASIS 2017

At BrightPlanet, we love sharing our knowledge with others about open source intelligence (OSINT), security risk management, data harvesting, and other topics related to the Deep Web and Dark Web. Are you in the security risk management business? Hear from Tyson Johnson, BrightPlanet’s Vice President of Business Development, as he shares his knowledge regarding security […] The post Learn About OSINT and Security Risk Management at ASIS 2017 appeared first on BrightPlanet.

Read more »

Forrester Blogs

The Data Digest: eCommerce May Be The Only Way To Challenge The Google/Facebook Duopoly

As will be no surprise to advertising industry watchers, the duopoly of Google and Facebook once again demonstrated its dominance of the ad market following the release of their Q2 2017 earnings...

...

Revolution Analytics

Preview: ALTREP promises to bring major performance improvements to R

Changes are coming to the internals of the R engine which promise to improve performance and reduce memory use, with dramatic impacts in some circumstances. The changes were first proposed by Gabe...

...

Forrester Blogs

VR and AR Offer Innovative Solutions For Real B2B Marketing Problems

If you’ve ever encountered virtual or augmented reality in the business world it was likely at an event or tradeshow where a company had an AR/VR demo at their booth to draw in “traffic.” While this...

...
Ronald van Loon

How Leading Organizations are Leveraging Big Data and Analytics

“Data will talk to you if you’re willing to listen”— Jim Bergeson.

Few can dispute that.

However, the challenge comes when data transforms into bundles and stacks of unorganized and unstructured data sets. The challenge comes with listening to big data and making sense of what it says.

With big data, the conversation becomes loud and noisy. You don’t hear the voice; you hear a cacophony. This is where organizations struggle.

And, amidst a struggle, you look up to the leaders to see how they are rising to the challenge. You observe, you learn, you implement and you adapt.

This is the first article of my “Under the Spotlight” series, where we will look at how leading organizations are leveraging big data and analytics, filtering the white noise out of the cacophony in the process, to closely follow and benefit from what data has to say.

These organizations span different industry verticals, including aerospace, sports, and life sciences, along with government agencies.

Airbus Leveraging Big Data and Analytics to Improve Customer Experience

Airbus has been a global leader in the aerospace industry for the last four decades, specializing in the design and manufacture of aerospace products, services, and solutions.

Operating in a complex and highly competitive industry means that Airbus has to be at its best in terms of efficiency, productivity, and innovation to deliver an unmatched service experience to its customers. Big data and analytics are helping the company in that respect.

Using the IBM InfoSphere Data Explorer, Airbus integrates data discovery, navigation, analysis, and a contextually relevant view of more than 4 TB of indexed data spread across different business units. All this data is then centrally accessible to people working in the service department, equipping them with valuable information for executing timely airline maintenance programs.

Leonard Lee, the vice president and head of new business models and services at Airbus Group, said in a recent interview, “We have tons of data. An aircraft is a very talkative machine. It produces petabytes of data. And today, in general, in the aerospace industry, only two percent of that data is used in any constructive way. So, our plan is to leverage all of the richness in that data to help improve our customer experience by driving initiatives like predictive maintenance. This way our customers can get airplanes back in the air as quickly as possible.”

This one application of big data and analytics has accounted for savings of more than $36 million for the company in a single year.

Another way the company has been leveraging big data and analytics is to improve lead times in the production of aerospace units, so that customers receive their deliveries on time.

Each shop floor has been empowered with digital solutions that allow workers across different production units to update the status of a project in real time. This data can then be communicated and shared between workers positioned across different shop floors, reducing paperwork inspections and encouraging a proactive production approach. The newest Airbus rotorcraft, the H160 helicopter, has been built on this newly erected production model.

Lee further expanded on the company’s strategy, adding, “What we are trying to do with our digital transformation effort is to build digitally-enabled, data-driven business models. We are working with strategic partners like Palantir, and others, to capture more value across the value chain, by having layers of analytics, machine learning and artificial intelligence, so that we can build solutions that would help us improve our customers’ experiences.”

NFL Teams Can Now Leverage Big Data and Analytics to Improve Performance Levels

In April, the NFL Players Association (NFLPA) entered into a partnership with WHOOP, a company that manufactures wearable devices. The objective of the partnership was to equip athletes with a technology that could help them track their health and performance levels.

The WHOOP device can be strapped on an athlete’s forearm, wrist or bicep, giving insight into his body while he trains or recovers.

  • It allows coaches and players to know how much sleep they receive and compare it to how much sleep they should be getting.
  • It measures an athlete’s muscle strain levels, which can then be leveraged to reduce muscular injuries and recovery times.
  • It allows coaches to measure the workload of every athlete separately, so that they can design training sessions accordingly.

Bioethicists Katrina Karkazis and Jennifer Fishman commented on this innovative initiative in an article, saying that if applied judiciously, responsibly, and ethically, biometric data technologies in professional sports have the potential to reduce injuries, improve performance, and extend athletes’ careers.

Isaiah J. Kacyvenski, a former American football linebacker, welcomed the idea, saying, “In the end, playing football is a job for us—athletes. As a football player, I always thought of my body as a business. The ability to create more value for the job you do should be acknowledged.”

The NFLPA has announced that the data, and the insights derived from it, will be solely owned by the athletes, who may use or sell it in any way they want. The announcement also prohibited use of the device during matches.

Big Data and Analytics May Speed Up Finding a Cure for Cancer

Business, sports, and even the life sciences industry are finding uses for big data. The life sciences industry is all about researching and expanding our understanding of the human body, to keep it healthy and disease-free.

The human body is a complex system of cells, tissues, and organs, with various biological molecules forming the fabric of this complex system. This system is in turn regulated by sets of genes, which are present in our DNA.

To put these details and complexity into a quantitative context:

  • There are 37.2 trillion cells in our body.
  • Each cell is made up of roughly 100 trillion atoms.
  • There are around 20,000 genes in a single cell.

Expand on each of these details and you will come across petabytes of data.
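A back-of-envelope calculation makes the petabyte claim plausible. The per-genome figure below is a commonly quoted ballpark for raw sequencing output, not a number taken from this article:

    # Assumed ballpark: roughly 100 GB of raw sequencing data per genome.
    gb_per_genome = 100
    patients = 20_000                     # a modest research cohort
    total_gb = gb_per_genome * patients   # 2,000,000 GB
    total_pb = total_gb / 1_000_000       # decimal units: 1 PB = 1,000,000 GB
    print(f"{total_pb:.0f} PB")           # -> 2 PB

Twenty thousand genomes is small by the standards of a national research program, and it already lands in petabyte territory.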

This shows the extensive amount of data that life scientists have to manage and decipher on a regular basis.

But the industry is up for the challenge. It believes big data and analytics can help speed up the process of finding cures for various diseases, even something as complex as cancer.

“By leveraging big data and analytics, we can begin understanding the basic facts about how tumors grow, how heterogeneous tumors are, and what the targets are, so we can create new drugs that work for particular tumors with particular genomic signatures,” said Robert Grossman, the principal investigator of the Genomic Data Commons project, in an interview with Chicago Inno.

 

What is Genomic Data Commons all about?

The Genomic Data Commons project is about making cancer data available to researchers worldwide, so that they can contribute to the findings and help speed up the search for cancer treatments. The data repository is housed at the University of Chicago and is one of the largest open-access repositories in the world.

“On the research side, the majority of researchers in cancer, I think, find the amount of data frustrating,” said Grossman. “They want to use all available data, but to set up an environment, to manage it, keep it secure and compliant—the process is just overwhelming. Our role is to bring together the large public research data sets, to consistently analyze them, and to make them available in a digestible form to the research community to accelerate the pace of research.”

The project was launched a year ago, and the team believes that over the next six to nine months they will be well resourced to make announcements regarding discoveries made through the use of the GDC.

Government Agencies Leveraging Big Data and Analytics to Ensure Safety of Citizens

One of the primary roles of government agencies is to collaborate and communicate with each other to ensure the safety and wellbeing of citizens.  

U.S. government agencies, at both the federal and state levels, have always worked hard to deliver on this responsibility. And now they are leveraging big data and analytics to reinforce their efforts and strategy.

Data and analytics are not a new domain for government executives. However, as data volumes rise while budgets are strained, the challenge is to use big data and analytics solutions that give agencies faster and clearer insights so they can respond proactively and quickly. An example of one such deployed solution is SPATIOWL by Fujitsu. The platform gathers traffic movement and transportation-related data from sensors installed in urban areas. This data can then be used to identify accident hotspots—areas with increased passenger and vehicle movement—so that preventative measures can be taken in advance to mitigate accident risks.
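To illustrate the hotspot idea in the abstract (a generic sketch, not Fujitsu’s SPATIOWL API), one simple approach is to snap each sensor event to a coarse latitude/longitude grid cell and count events per cell; the densest cells are candidate hotspots.

    from collections import Counter

    def hotspot_cells(events, cell_deg=0.01, top=3):
        """events: iterable of (lat, lon) pairs; returns the densest grid cells."""
        counts = Counter((round(lat / cell_deg), round(lon / cell_deg))
                         for lat, lon in events)
        return counts.most_common(top)

    # Three illustrative events; the first two fall in the same ~1 km cell.
    events = [(35.6895, 139.6917), (35.6897, 139.6915), (35.7000, 139.7000)]
    print(hotspot_cells(events))  # the shared cell surfaces first

A production system would also weight events by severity and normalize by traffic volume, but the binning idea is the same.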

Another example is the use of big data and analytics to anticipate natural disasters and improve disaster management activities. Government agencies are leveraging technology to acquire high-resolution satellite imagery and seismic data. With the help of analysis offered by machine learning and artificial intelligence, this data is combined with historical information to identify patterns and predict natural disasters. Moreover, platforms are being integrated with disaster-prediction algorithms that allow government agencies to monitor, in real time, the different delivery channels that make disaster management services accessible. As a result, the standards of the delivered services are improving.

The world of big data and analytics is challenging but insightful. It provides actionable insights that help businesses and organizations automate their processes, gain insight into their target markets, and optimize their existing operations for improved productivity and efficiency.

But only if one is willing to embrace its cacophonic nature. And, based on these examples, few can dispute that.

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. He contributes his expertise towards the rapid growth of Simplilearn’s popular Big Data & Analytics category.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for data science, big data, business intelligence, and data mining. He is a guest author on leading big data sites, a speaker, chairman, and panel member at national and international webinars and events, and runs a successful series of webinars on big data and digital transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in big data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience, and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.


The post How Leading Organizations are Leveraging Big Data and Analytics appeared first on Ronald van Loons.

 

September 19, 2017


Forrester Blogs

Forrester’s B2B Marketing Forum Reinforces The Concept Of “Customer As Compass” (And Covers What To Do About It)

I hope that nobody in your organization needs convincing that you should join us at the B2B Marketing Forum in Austin, Texas, this year (October 5 and 6, 2017). We have an incredible agenda with a...

...

Forrester Blogs

Thinking Of Hiring A Product Manager? Read This First.

Do you currently have someone in your organization who fits this job description? If not, then it’s time you do. In the past year, we’ve seen a spike in inquiries on the topic of product managers,...

...

Revolution Analytics

Hurricane Irma's rains, visualized with R

The USGS has followed up their visualization of Hurricane Harvey rainfalls with an updated version of the animation, this time showing the rain and flooding from Hurricane Irma in Florida: Another...

...

Forrester Blogs

Forrester Gathers Experts Across Disciplines To Tackle Europe’s Most Pressing Privacy, Security, And Trust Challenges

Fresh off a successful event in Washington, DC last week, we’re gearing up for Forrester’s Privacy & Security Forum Europe in London on 5-6 October. Forrester is gathering experts in...

...

Forrester Blogs

Amazon And Flipkart Are Expected to Record $1.2 Billion To $1.5 Billion In Sales In India’s Coming Festive Season

On October 6, 2014, Flipkart launched Big Billion Day, an event that occurs within India’s festive season — a holiday period that accounts for 40% of the total sales of key brands in India. Now in...

...
 

September 18, 2017


Forrester Blogs

New Forrester Waves Assess Customer Journey Analytics Platforms

Why is journey analytics such a hot topic? Because it can help firms move the needle on customer obsession. That’s why after months of research — including in-depth briefings, demos, and customer...

...
 

September 17, 2017


Simplified Analytics

Digital Transformation in the Fashion Industry

Gone are the days when brand communication was mostly made up of ads that appeared on billboards, in magazines and/or on television. Today, all of this is augmented with Digital revolution. The...

...
 

September 15, 2017


Revolution Analytics

Because it's Friday: Rapid Unscheduled Disassembly

SpaceX has done some amazing work proving the concept of commercial spaceflight services. But that's not to say there haven't been a few bumps along the way, as this "blooper reel" (set to Monty...

...

Revolution Analytics

Microsoft R Open 3.4.1 now available

Microsoft R Open (MRO), Microsoft's enhanced distribution of open source R, has been upgraded to version 3.4.1 and is now available for download for Windows, Mac, and Linux. This update upgrades the...

...
Principa

Things you wanted to know about Mathematical Optimisation, but were afraid to ask

Today we explore some of the frequently asked questions around mathematical optimisation. For the most part, the questions are answered in the context of credit risk. However, mathematical optimisation, and operations research in general, have many applications...

 

September 14, 2017


Revolution Analytics

Working with data frames in SQL Server R Services

Most R users are quite familiar with data frames: the data.frame is the fundamental object type for working with columnar data in R. But for SQL Server users, the data frame is an important concept...

...

Forrester Blogs

The Allure of Small

I’ve always been drawn to small. I suspect it’s because I started out small. Not in the “all-babies-start-out-small-duh-forrester-analyst” way, but in the...

...

Forrester Blogs

Why The Convergence Of Adtech And Martech Matters

As we covered in our recent webinar, the convergence of advertising technology and marketing technology is inevitable — and that’s a good thing for marketers. The union of these two tech worlds is...

...
Jean Francois Puget

Just label data!

Machine Learning and Deep Learning are very promising technologies. Every week comes with its new hyped successes. Yet, when it comes to applying machine learning and deep learning, many people keep making the same mistakes. Here is one that is particularly troublesome: people often miss that you need to provide examples to learn from. They expect systems to learn from raw data without any supervision or feedback.

I can't blame them, as many proponents of machine learning, deep learning, or artificial intelligence just speak about how systems learned from data, without entering into the gory details of what you need to do to get there.

In order to wake them up, I use a very simple story:

If you give a bike to children, they won't be able to ride it. But if you show them, they'll learn in a couple of days or less. Machines are no different from humans: they learn from examples, from feedback, or from both.

Let me give concrete examples of the issue.

  • I was asked to help sell machine learning to a bank. They asked us to have a look at predicting failures of their payment system. There had been previous attempts with various technologies and various consulting companies, but they all failed. The training data would be transactions processed in the past.
    My dialog with the client team went like this:
    - (me) Do we know which transactions failed among all the training data transactions? 
    - Yes
    - Do we have enough data? 
    - Yes, we have millions in our training data set. 
    - Perfect, can I get access to data?
    It took a while to get access because of security concerns, as is often the case in regulated industries like banking, but I finally got access to a file of a million transactions. I looked at it: there were hundreds of features, but I could not find the target values (whether a given transaction is considered a failure or not). When I asked about it, I was pointed to a second file where the target values were. True, the values were there, and all I needed to do was join it with the other data set.
    There was an issue, though: the target file contained only about 1,700 transactions, with only 8 failures among them. No wonder no machine learning approach would work: how can you learn from only 8 examples? (A sketch of the sanity check that would have surfaced this up front follows the list.)
    I suggested they spend time labeling other transactions. Assuming 10 transaction labels per minute, a one-man-week effort would yield more than 10 times as many examples to learn from.
    Yet the client team refused to do that, and guess what: they still have no success with machine learning.
  • In another client context, I was asked to help sell Watson (our deep-learning-enabled set of APIs). The client team was more mature than the one above, and they fully understood that they needed labelled examples. Their question to us was: can Watson label the examples for us?
    When you think about it, they were asking the learning system to label the examples it would be trained on. This is machine learning upside down: normally you train your system first; then, if you did a good job at training, the system can label data correctly.
    Our answer was that, unfortunately, they had to provide the labels themselves, of course.
  • In a more recent example where we were missing labels, the client team asked me if we could use unsupervised learning, like clustering. Why? Because, as the name indicates, unsupervised learning techniques do not require labels. But this comes at a price: these methods are not predictive at all. You cannot use them in isolation to learn how to predict a target.
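Here is the sanity check from the first example, sketched in Python with hypothetical file and column names: join features to labels and inspect label coverage and class balance before touching any model.

    import pandas as pd

    # Hypothetical files standing in for the client's two data sets.
    features = pd.read_csv("transactions.csv")  # millions of rows, key "txn_id"
    labels = pd.read_csv("labels.csv")          # "txn_id" plus "failed" (0/1)

    labelled = features.merge(labels, on="txn_id", how="inner")

    coverage = len(labelled) / len(features)
    print(f"label coverage: {coverage:.4%}")    # here: ~0.17% of a million rows
    print(labelled["failed"].value_counts())    # here: 8 positives -> not learnable

    # Rule of thumb: if either number looks like this, stop and label more
    # data before reaching for any model.

Ten minutes of this kind of inspection would have saved every one of those earlier failed attempts.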

I'll stop here, but this is a recurring pattern with people starting their machine learning journey. The best advice we can give them is very well expressed in this tweet:

[embedded tweet]

I would not have said it better.

 

In people's defense, labeling data can be costly and error-prone. Machine Learning adoption will get easier if systems can learn from less data. There are two active research areas trying to address this shortcoming of machine learning:

  • One is active learning, where a human labels only a small, carefully chosen subset of the data (typically the examples the system is least sure about), and the system labels the rest.
  • A second one is transfer learning: you apply an already-trained model to a new domain, where you train it further. The good news is that most of the training work was done beforehand, so you need only a small number of examples to adapt the model to the new use case (see the sketch below).
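To make the transfer-learning idea concrete, here is a minimal Keras sketch; the base network, input size, and five-class head are placeholder choices, not a prescription. The idea is to freeze a model pretrained on a large corpus and train only a small head on the new domain's scarce labels.

    import tensorflow as tf

    # Start from a network pretrained on ImageNet and keep its weights fixed.
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights="imagenet")
    base.trainable = False

    # A small trainable head adapts the frozen features to the new task.
    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(5, activation="softmax"),  # placeholder class count
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(small_labelled_dataset, epochs=5)  # a few labels now go a long way

Because only the head's weights are trained, a few hundred labelled examples can be enough where training from scratch would need millions.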

An alternative is to look at the example-generation process itself. In some cases it is possible to get labels as a side effect of using machine learning. For instance, if you use machine learning to make product recommendations for an e-commerce site, then each time a visitor buys, you get labelled data for free: you know what the visitor was really interested in, namely the products in their basket. You can then compare this with the predictions your ML algorithm made and have it learn from the outcome. This learning from feedback is at the heart of many successful applications of machine learning; see Machine Learning Algorithm != Learning Machine for more details.
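A small sketch of that feedback loop (the event shape and file name are invented for illustration): log each recommendation next to what the visitor actually bought, and the log itself accumulates labelled training data.

    import csv
    import datetime

    def log_feedback(visitor_id, recommended, purchased,
                     path="feedback_log.csv"):
        """Append one (recommendation, outcome) row per item; each row is a label."""
        basket = set(purchased)
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            for item in recommended:
                # label = 1 if the recommended item ended up in the basket
                writer.writerow([datetime.date.today(), visitor_id, item,
                                 int(item in basket)])

    log_feedback("v42", recommended=["tent", "stove"], purchased=["stove"])
    # feedback_log.csv now holds one negative ("tent") and one positive ("stove")
    # example, ready for retraining the recommender.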



Forrester Blogs

Marketers In Asia Pacific: What Are Your Top Martech Challenges?

Forrester’s 2017 Asia Pacific Martech Challenges Survey is open, and we’re looking for B2C marketers in the region to provide their perspective. If you’re a marketing leader who is involved in...

...
 

September 13, 2017


Forrester Blogs

A Random Walk Through Ikea With James McQuivey

Come with me on what I’m calling a random walk with James McQuivey. This walk is through Ikea. I went there in search of a simple question: What is it about this experience that gives Ikea one...

...

Forrester Blogs

Apple’s 10th Anniversary Of The iPhone – What’s Next?

Apple celebrated the 10th anniversary of the iPhone with both updates to its existing product lineup and the opening of the Steve Jobs Theater in Cupertino, CA on its new campus with the...

...

Forrester Blogs

Metrics That Matter Still Matter in ABM, Because – Why Wouldn’t They?

For my most recently published research, I set out to find or define the metrics that capture account-based marketing (ABM) success. It’s a question I hear from clients very often, and one we often...

...
Ronald van Loon

Equestrian Sports – Where Trots Get Digital and Technology Meets Tradition

Equestrian sport is one of the oldest forms of sporting entertainment; its history dates back to ancient Greek civilization. Since then, the trots have transcended the barriers of each era, keeping stride with the customs of every age and entertaining the crowds along the way. And in the present age, where everything is digitized, the trots have become digital too.

I recently attended the annual World Equestrian Festival, CHIO Aachen—the Wimbledon of equestrian sports—and interviewed Ingrid Klimke, the two-time Olympic equestrian champion; Michael Mronz, the organizer of the CHIO Aachen; and Björn Ganzhorn, the Head of SAP Global Sponsorships.

We have heard about the deployment of SAP-powered solutions for enterprise management. Now SAP is also making strides in the world of sports, integrating data and analytics with traditional thrill and passion to dramatically transform the sports experience for fans, media, athletes, and organizers. Leading this race in the adoption of data- and analytics-driven solutions is equestrian sport.

CHIO Aachen and Data & Analytics

In 1924, the first horse show took place at the Aachen Soers, which remains the venue of the CHIO Aachen to this day. Since then, the organizers of the CHIO Aachen have been attracting fans of equestrian sports by staging athletic performances that nowadays involve more than 500 competitors each year. With a spectating audience of over 360,000 visitors, the event organizers are always on the lookout for ways to offer fans a better and more engaging experience every year.

SAP, as a brand, has always aimed at making people’s lives easier and better through data- and analytics-driven solutions. As such, the focus of the two converges on us—the people.

It was this overlapping area of interest that prompted CHIO Aachen and SAP to enter into a partnership and show how data and analytics can be leveraged to improve operations in any industry—be it manufacturing, retail, or sports.

When interviewed, Michael Mronz described the partnership as:

“We are moving into a digital world. Be it social media, online shopping, streaming, searching for information, everything has become digital around us. This digitalization has weaved a sense of ownership, engagement, and personalization in our existence that was not known before. Our followers, the fans of Equestrian sports, are part of this digital community. The tendency for digital transformation of the sport, is a natural expectation, that they harbor. With SAP, being a pioneer brand of making digital transformation possible of our physical world, we saw this as a perfect opportunity to pair with them, and deliver what our fans expect from us.”

All of this means increased engagement: not just passive following, but an active and immersive sports-following experience.

Björn Ganzhorn extended the conversation by expressing his views about the partnership:

“Our goal as an organization has always been to bring our brand to life and enable digital transformation for our partners. Google and Facebook have taken customer expectations to a next level. At SAP, we develop engaging solutions to help our partners meet the ever-increasing expectations of their customers. Our partnership with CHIO Aachen and Equestrian sports in general reflects this endeavor that we have embarked upon.”

Exploring the details further, he said:

“We work with the team at CHIO Aachen to establish what kind of more involvement we can bring to the spectators; how can we make the sports experience more interesting and engaging for them and what can we do to simplify the operations of Equestrian sports. This way, we can help broaden the reach of the sport to a more global audience. SAP helps CHIO Aachen to deliver an authentic experience at the Aachen festival, which allows us the opportunity to show our brand diversification and tell people, if data and analytics can help them, it can help you, no matter whichever vertical you belong to—arts, education or anything else.”

The Manifestation of Efforts

As CHIO Aachen and SAP sat together and worked towards the goal of delivering a better experience for their customers—the fans—their efforts manifested in a technological offering that is reshaping the meaning of fan engagement.

Enter the Spectator Judging app and the Equestrian Fan Quiz.

Through the Spectator Judging app, fans can rank and score the riders in real time during the dressage and vaulting events at the CHIO Aachen. Spectators are now the judges. The scores are then displayed on the board alongside the scores of the panel judges, where the competitors can see them too.

The Equestrian Fan Quiz leverages gamification to engage fans young and old, asking questions about the sport in general and the event in particular. It tests their knowledge while providing an opportunity to learn more about the sport. App users earn points for every answer they give and collect badges in return. Fans see it as fun and a great way to spend time during breaks.

And that’s not all. The partnership has further evolved, and with it, so have the solutions on offer, with the team bringing the Internet of Things to the world of equestrian sports for the first time.

“When we talk about transforming the sports experience, it’s not just the fans that we have to cater to. We also have to look at the riders and their horses, and think about improving their experience as well.”

said Michael Mronz.

“Through data & analytics, we can provide information to our spectators to enlarge the live experience during the event and to the teams to improve their performance on course.”

added Björn Ganzhorn.

This is where the Equestrian Analytics app comes in.

Equestrian Analytics is a platform that uses sensor-captured data to provide insight into an athlete’s performance. It helps fans follow and understand the action as it unfolds, and allows riders to identify and work on their weak points so that next time they can perform even better. The sensors are lightweight and mounted on the rider. There is also a helmet-mounted camera that gives spectators a close-up view of the action as they track the rider in real time through a cross-country course.

To explain the technology and its benefits, there could be no one better than Ingrid Klimke, the two-time Olympic equestrian champion and ambassador of the SAP equestrian partnership. Her sustained success in the sport over more than 20 years has earned her the honor of being the first female rider to receive the German honorary title of “Riding Master”.

“When the first prototype was made, it was not as refined as the technology that powers my performances at present. I looked like a spacewoman, wearing all those bulky devices.”

“I have always been open to trying and testing different things. It allowed me to explore how I could use technology to better analyze my rides, and also to bring my own experience closer to the fans while I am on the course. This way, they can know what we are doing out there and what the sport looks like from a rider’s perspective.”

As for how the app helps with performance analysis and assists her in improving her strategy, Ingrid Klimke said:

“Success in any format of equestrian sport, more so in cross country, is half dependent on the horse and half dependent on the rider. When I sit on the horse, my responsibility revolves around two things. First, I should know my speed and the distance between the fences. When I don’t know my speed or the distance, I don’t give my horse the chance to react. The second thing is, I have to choose the right way. You have to consider the angle, the line, because sometimes on a course there can be a complex of fences. So you have to approach it at the right angle, in the right way, if you want to make it through without overstressing your horse. When I have access to this type of information, I can shape my strategy and leave its execution to my horse. Previously, when I did not have the technology at my disposal, I did not have the insight. So even when I thought I was doing fine, in reality there might still have been things to improve. Through this app, I can now analyze my ride and sit with the chief trainers before the next performance to optimize my strategy. This way, I can do much better as a rider and perform much better, while also supporting my horse to give him a good ride.”

The Formula for Success

The success of any partnership depends on some core elements.

Ingrid Klimke says:

“SAP Sports is a wonderful team. We work, collaborate and communicate like friends. I tell the team what I want. I give them the idea. Based on my needs, they then execute the idea and work towards making it happen. When they come up with a prototype, I test it and suggest where improvements can be made.

For instance, to round out my analysis, it is not only key to know the speed, line and nature of the terrain, but also how I meet my strategic waypoints – the so-called minute markers – which are critical to reaching the optimum time. So I asked the team to integrate the minute-marker feature into the existing analysis, which they did almost instantly. This now gives me a more comprehensive view of the accrued time deltas and helps improve my decision making. It’s also a very nice feature for the fans who follow my course on the app, as it shows them whether I will make it within the optimum time.”
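
To illustrate the minute-marker arithmetic with a hypothetical sketch of my own (not the SAP app’s code): compare the measured time at each marker against the planned pace to get the accrued delta.

    # Hypothetical minute markers on a cross-country course: the planned
    # pace puts the rider at each marker every 60 seconds.
    planned = [60, 120, 180, 240]   # target seconds at markers 1..4
    actual  = [58, 123, 185, 239]   # measured seconds from the sensors

    for marker, (p, a) in enumerate(zip(planned, actual), start=1):
        delta = a - p               # positive = behind the optimum pace
        status = "behind" if delta > 0 else "ahead"
        print(f"marker {marker}: {abs(delta)} s {status}")
    # If the accrued delta is still positive at the last marker, the
    # optimum time is at risk.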

Björn Ganzhorn shared the success formula by emphasizing the importance of trust.

“It’s the trust that I believe is the formula for the success of this partnership. If Ingrid doesn’t trust the solutions that we come up with, we won’t be able to do anything. The same goes for the team at CHIO Aachen. If they do not trust what we have to offer, this partnership would never sustain itself.”

Michael Mronz added:

“If we are discussing the main elements that have contributed to the success of our partnership with SAP, the design-thinking approach of the team is what I would like to add. As a team, we focus on what we can do to make things better. We talk about the perfect scenarios and then look at how technology can help us achieve that perfect scenario.”

In Conclusion

When you add everything up and consider the details of this partnership, you start realizing that data and analytics, just as in business, have many benefits to offer the world of sports. Here, the focus is on delivering an improved experience to fans and athletes; in a business, the focus is on delivering an improved customer experience. Here, you can use data and analytics to gain insight into a performance; in an enterprise setting, you can use data and analytics to study shop-floor execution.

This is what my fascination with the SAP equestrian strategic partnership allowed me to explore. I hope the information in this article was equally beneficial for you. I would like to express my sincere gratitude to Michael Mronz, Björn Ganzhorn and Ingrid Klimke for taking the time to share their knowledge on this great transformation that equestrian sport is witnessing.

If you want to access the full interview and learn more about what our experts had to share, please subscribe to the webcast by clicking here.

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. His company provides Big Data & Analytics certification courses, along with other leading certification programs.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for Data Science, Big Data, Business Intelligence and Data Mining. He is a guest author on leading Big Data sites, a speaker, chairman and panel member at national and international webinars and events, and runs a successful series of webinars on Big Data and on Digital Transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in Big Data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.

More Posts - Website

Follow Me:
Twitter | LinkedIn


The post Equestrian Sports – Where Trots Get Digital and Technology Meets Tradition appeared first on Ronald van Loons.


Revolution Analytics

A new EdX course on building Artificial Intelligence Applications

If you'd like to learn the fundamentals of Artificial Intelligence applications, a new course is now available free on EdX. Introduction to Artificial Intelligence (AI), presented by Microsoft, will...

...
 

September 12, 2017


Forrester Blogs

The Quest For The Holy Grail Of Team Messaging Apps: Finding Your Perfect Solution

We’re seeing a significant challenge today in finding the right enterprise-level team messaging app that’s easy to use, compatible with a work environment, and combines various daily functions so...

...

Forrester Blogs

Continuous Delivery and Release Automation – Critical To DevOps Success

I am seeing more and more inquiries focused on driving business transformation where the transformation depends on software and business technology.  ING is a model for digital transformation, and...

...
 

September 11, 2017


Revolution Analytics

Online textbook on data visualization with the ggplot2 package

A new online textbook, Data Visualization for Social Science, will teach you everything you need to know about creating beautiful and elegant data visualizations using the ggplot2 package and the R...

...
 

September 10, 2017


Simplified Analytics

How machine learning APIs are impacting businesses?

In this Digital age, every organization is trying to apply machine learning and artificial intelligence to their internal and external data to get actionable insights which will help them to be...

...
 

September 09, 2017

Principa

Optimizing your marketing spend

Ever wondered how to calculate the best mix of actions in order to achieve the desired result within budget? You might have a pretty good idea of what mix has worked well in the past, but how much rigour goes into that process? Wouldn’t you like a mathematical approach that eliminates the guesswork?

 

September 08, 2017


Forrester Blogs

Equifax Does More Than Credit Scores

Our reaction to the Equifax breach was similar to what we imagine many people went through. First, we wanted to know if we were affected. Second, what about our spouse and other immediate family...

...

Revolution Analytics

Because it's Friday: Can't find the one

Radiohead has long been one of my favourite bands. The band members were classically trained, so there's always something interesting in the arrangement or melody to appreciate, even for musical...

...
Ronald van Loon

What Skills Do I Need to Become a Data Scientist?

 

Leveraging big data as an insight-generating engine has driven the demand for data scientists at the enterprise level, across all industry verticals. Whether it is to refine the process of product development, help improve customer retention, or mine through data to find new business opportunities—organizations are increasingly relying on the expertise of data scientists to sustain, grow, and outdo their competition.

Consequently, as the demand for data scientists increases, the discipline presents an enticing career path for students and existing professionals alike. This includes those who are not data scientists but are obsessed with data, and who are left asking:

What skills do I need to become a data scientist?

This article aims to answer this question. We will dive into the technical and non-technical skills that are critical for success in data science.

  • If you are a potential data scientist, you can use the information herein to carve out a successful career in data science.
  • If you are a data analytics director at an organization, you can leverage the information to train your existing team of data scientists and make them more productive and efficient at their work.

This is an address for all those who love to wrangle and rumble with Big Data.

Technical Skills Required to Become a Data Scientist

Statistical analysis, and the know-how to leverage computing frameworks to mine, process, and extract value from bulk unstructured data, are the most important technical skills required to become a data scientist.

This means that you need to be skilled at math, programming and statistics. One way of meeting this prerequisite is a solid academic background.

Data scientists usually have a Ph.D. or Master’s Degree in statistics, computer science or engineering. This gives them a strong foundation to connect with the technical points that form the core of the practice in the field of data science.

There are some schools that now offer specialized programs, tailored to the educational requirements for pursuing a career in data science.

Those who don’t want to opt for this focused-but-extensive approach can pursue other options, including Massive Open Online Courses (MOOCs) and boot camps. Some programs worth exploring are Simplilearn’s Big Data & Analytics certification courses. They can help deepen your understanding of the core subjects underpinning a data scientist’s practice, while providing a practical learning approach that you will not find within the confines of a textbook.

Other technical skills required to become a data scientist include:

1) Programming: You need knowledge of programming languages like Python, Perl, C/C++, SQL and Java—with Python being the most common coding language required in data science roles. Programming skills help you clean, massage and organize unstructured data, as the sketch after this list illustrates.

2) Knowledge of SAS and other analytical tools: Knowledge of analytical tools is what helps you extract valuable insights from the cleaned, massaged, and organized data set. SAS, Hadoop, Spark, Hive, Pig and R are the most popular analytical tools that data scientists use. Certifications can further help you establish your expertise with these tools.

3) Adept at working with unstructured data: When talking about the skill of working with unstructured data, we are specifically emphasizing a data scientist’s ability to understand and manage data that arrives unstructured from different channels. So if a data scientist is working on a project to help the marketing team produce insightful research, the professional should also be adept at handling social media data.
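
Here is a minimal Python/pandas sketch of that cleaning step; the records and column names are hypothetical, invented only to illustrate the idea:

    import pandas as pd

    # Hypothetical raw records: stray whitespace, inconsistent case,
    # a bad date, a missing value, and a duplicate row.
    raw = pd.DataFrame({
        "customer": [" Alice ", "BOB", "bob", "Carol", None],
        "signup":   ["2017-01-05", "2017-05-01", "2017-05-01", "not a date", "2017-04-01"],
        "spend":    ["100", "250", "250", None, "80"],
    })

    clean = (
        raw.assign(
            customer=raw["customer"].str.strip().str.title(),      # normalize names
            signup=pd.to_datetime(raw["signup"], errors="coerce"),  # bad dates -> NaT
            spend=pd.to_numeric(raw["spend"], errors="coerce"),     # strings -> numbers
        )
        .dropna(subset=["customer"])  # drop rows with no usable key
        .drop_duplicates()            # "BOB" and "bob" collapse to one row
    )

    print(clean)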

Non-Technical Skills Required to Become a Data Scientist

We will now shift our focus to the non-technical skills required to become a data scientist. These skills are part of a candidate’s persona and, as such, can be difficult to assess simply by looking at educational qualifications, certifications and so on.

They are:

1) A strong business acumen: If a data scientist does not have business acumen and the know-how of the elements that make up a successful business model, all those technical skills cannot be channeled productively. You won’t be able to discern the problems and potential challenges that need solving for the business to sustain and grow. You won’t really be able to help your organization explore new business opportunities.

2) Strong communication skills: You are a data scientist and understand data better than anyone else. However, for you to be successful in your role, and for your organization to benefit from your services, you should be able to communicate your understanding to someone who is a non-technical user of data. You need strong communication skills as a data scientist.

3) Great data intuition: This is perhaps one of the most significant non-technical skills a data scientist needs. Great data intuition means perceiving patterns where none are observable on the surface, and knowing where the value lies in an unexplored pile of data. This makes data scientists more efficient in their work. It is a skill that comes with experience, and boot camps are a great way of polishing it.

Data Scientists – The Unicorns

Shashi Upadhyay, the CEO of Lattice, a provider of AI-enabled big data inference engines, once referred to data scientists as unicorns, calling them “professionals with a diverse skill set that is not commonly found in a single individual.” This explains why data scientists are so valued, and why becoming one is so challenging. But it is not impossible.

At least for the likes of us, who love wrangling and rumbling with data, nothing is impossible!

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. His company provides Big Data & Analytics certification courses, along with other leading certification programs.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for Data Science, Big Data, Business Intelligence and Data Mining. He is a guest author on leading Big Data sites, a speaker, chairman and panel member at national and international webinars and events, and runs a successful series of webinars on Big Data and on Digital Transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in Big Data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.

More Posts - Website

Follow Me:
Twitter | LinkedIn


The post What Skills Do I Need to Become a Data Scientist? appeared first on Ronald van Loons.


Revolution Analytics

Hurricane Harvey's rains, visualized in R by USGS

On August 26 Hurricane Harvey became the largest hurricane to make landfall in the United States in over 20 years. (That record may yet be broken by Irma, now bearing down on the Florida peninsula.)...

...
InData Labs

Jump-start your career in Data Science with InData Labs and Wargaming!

This September, in partnership with the famous Wargaming.net, we are launching WG Forge - a new educational program for Belarusian students. If you have the passion, determination, and commitment to develop a Data Science career, our program is a great chance for you to learn from the most competent professionals in the industry, develop your skills and start your career.

The post Jump-start your career in Data Science with InData Labs and Wargaming! first appeared on InData Labs.


Forrester Blogs

Forrester’s Privacy And Security Forum Brings Diverse Experts To Devious Challenges

Well, the privacy hits keep coming: another breach, more than a hundred million people affected, untold losses for another company and its customers. Next week, September 14-15 in Washington DC,...

...

Forrester Blogs

Cover Your aaS Renewal Discussions

Many of our clients are fast approaching their contractual renewal dates for Cloud and as-a-service (aaS) offerings. They have expressed concerns over vendors changing licensing models and pricing...

...
 

September 07, 2017


Revolution Analytics

In case you missed it: August 2017 roundup

In case you missed them, here are some articles from August of particular interest to R users. Using the featurizeText function in the MicrosoftML package to extract ngrams from unstructured text. A...

...
Silicon Valley Data Science

Themes from JupyterCon 2017

This past August was the first JupyterCon—an O’Reilly-sponsored conference around the Jupyter ecosystem, held in NYC. I attended on behalf of Silicon Valley Data Science (SVDS) and presented a poster. We make extensive use of Jupyter (Notebook, Hub, nbconvert, etc.) in our data science consulting work and love to show our support for open source projects. JupyterCon was one of the best conferences that I’ve been to, and I learned a great deal from the few days that I was there. There were several themes that presented themselves during the conference that I would like to highlight:

  • reproducible science and collaboration
  • Jupyter for teaching
  • future possibilities for Project Jupyter

In this post I will present a number of talks grouped by their themes, with some thoughts surrounding them.

Reproducible (data) science and collaboration

The Jupyter Project grew out of the IPython framework that was started by an academic (Fernando Perez) as an “afternoon hack.” From the beginning, the project focused on how to better use computational tools to solve problems faced by working scientists. This pedigree shows through to today in many ways—reproducibility and collaboration are key concepts in science, and were addressed by a number of talks at JupyterCon.

The following two keynotes spoke at a high level about collaboration and reproducibility.

The next two talks (and my own poster) get more into implementation specifics.

  • In Design for reproducibility, Lorena Barba tackled the challenge of reproducibility directly.
  • In How Jupyter makes experimental and computational collaborations easy, Zach Sailer explained how his collaboration combines the various pieces of the Jupyter ecosystem (what he called orbit) to develop, communicate, and share their science. His slides are available here, with a YouTube video hopefully coming online soon.
  • I presented a poster based on my previous work on collaboration for data science teams. A number of people stopped by during the poster session to chat about the challenges they face in working on teams and to share ideas for solutions.

Jupyter for teaching

Jupyter Notebooks allow teachers to give students a document that interleaves narrative and description with interactive code snippets and challenges. Properly used, this makes for an excellent pedagogical tool. Of course, deploying code that is meant to be altered by students on their own laptops, with every conceivable hardware configuration, can be a daunting task. A number of talks described how speakers tackled this challenge.

The future of Jupyter/JupyterLab

What’s next for the Jupyter Project? Throughout the conference, the message was JupyterLab, a new frontend to many of the tools that exist in the Jupyter ecosystem. JupyterLab was demoed in tutorials and talks throughout, and the newest version (0.27) was released on the first day of the conference.

What this means is that Jupyter Notebooks aren’t going anywhere: they feature prominently within JupyterLab. Having played around with earlier versions of JupyterLab, I was very happy with the newest release as it feels like it has come a long way. A few talks from the Jupyter team demonstrated what JupyterLab offers.

  • I recommend looking at JupyterLab: The next-generation Jupyter frontend when it comes out on YouTube. The core developers of the Jupyter Project demonstrated the newest version of JupyterLab. They did a nice job selling the improvements over a simple Jupyter Notebook server.

Other future thoughts centered around deploying something interactive from Jupyter so that other users could gain insight from some analysis.

Wrapping up

Finally, I want to highlight a talk that doesn’t really fit any of these themes, but simply blew me away.

  • A billion stars in the Jupyter Notebook by Maarten Breddels. Keep an eye out for the video, but let me assure you that it’s far more than simply plotting a billion stars. The talk demonstrated the amazing capabilities of several visualization libraries; from the conference description: “Maarten Breddels offers an overview of vaex, a Python library that enables calculating statistics for a billion samples per second on a regular n-dimensional grid, and ipyvolume, a library that enables volume and glyph rendering in Jupyter notebooks. Together, these libraries allow the interactive visualization and exploration of large, high-dimensional datasets in the Jupyter Notebook.” Maarten fully delivered on these promises!
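
To make the “statistics on a regular n-dimensional grid” idea concrete, here is a small NumPy sketch of the kind of binned computation vaex accelerates. This illustrates the concept only; it is not vaex’s API, and the data are simulated:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 1_000_000                          # vaex pushes this toward a billion
    x, y = rng.normal(size=(2, n))         # simulated 2-D positions
    v = x + rng.normal(scale=0.1, size=n)  # a per-sample quantity

    # Mean of v on a regular 64x64 grid over [-3, 3]^2.
    bins, lo, hi = 64, -3.0, 3.0
    ix = np.clip(((x - lo) / (hi - lo) * bins).astype(int), 0, bins - 1)
    iy = np.clip(((y - lo) / (hi - lo) * bins).astype(int), 0, bins - 1)
    flat = ix * bins + iy                  # flatten the 2-D bin index

    counts = np.bincount(flat, minlength=bins * bins)
    sums = np.bincount(flat, weights=v, minlength=bins * bins)
    grid_mean = (sums / np.maximum(counts, 1)).reshape(bins, bins)
    print(grid_mean.shape)                 # (64, 64): mean of v per grid cell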

Overall, it was an excellent conference and I learned a lot. Were you there? Tell us about your favorite sessions in the comments.

The post Themes from JupyterCon 2017 appeared first on Silicon Valley Data Science.


BrightPlanet

Greater Data Harvest Opportunities with the New Rosoka Update

For years, BrightPlanet has been proud to partner with Rosoka, an industry leader in text analytics solutions. We enjoy working with Rosoka because of their commitment to bringing unique value to organizations by helping them automatically identify the important entities in unstructured data. Rosoka recently made some innovative updates to their software. These updates are […] The post Greater Data Harvest Opportunities with the New Rosoka Update appeared first on BrightPlanet.

Read more »

Forrester Blogs

Data Commercializers Do It Differently: It’s Best To Be Prepared

We know that the data economy is going to be huge.  Data fuels innovation, and having that unique data point (or “alt data”) brings incremental value and competitive advantage. Demand for data is...

...
Ronald van Loon

Digital Transformation in Healthcare: A Practical Success Case

Digital transformation continues to penetrate the paradigms of every industry. Whether it is the manufacturing industry, retailing industry or the service industry, the footprints of digital transformation can be found across all.

Yet, amidst the great disruption, a resistance festers. A resistance that comes from the healthcare sector.

Dr. Robert Wachter describes it as:

“An inability to understand how computerization would utterly change the work”

To elaborate, many people working in the healthcare industry have failed to understand the context in which digital transformation should assist them. They perceive the disruption as a siphon: a force that takes away what makes them best at their work. That is:

  • The human touch.
  • The sense of ownership.
  • The familiarity index.

When one listens to these concerns, the first thing that springs to mind is:

If digital transformation is about migrating from analog to digital, it shouldn’t really “siphon” out the core essence of the established practices.

And yet, the resistance persists.

Perception is the problem, and as such, a practical demonstration could be the only solution.

Meet Mr. Guilherme Rabello

Together with my friends Jim Harris and Eric Kavanagh, I searched far and wide, and the search took us to Frankfurt. There, at the SAP Leonardo Live event, we got the opportunity to interview Mr. Guilherme Rabello, the Commercial and Market Intelligence Manager at InovaInCor, Latin America’s biggest healthcare complex for cardiac disease treatment and research.

His organization has successfully piloted a digital transformation project, with Guilherme Rabello a primary member of the team behind it.

The Decision to Go Digital

When asked what prompted him and his team to embark on this digital journey, Mr. Guilherme explained the challenges that the healthcare industry in general, and InovaInCor in particular, faces:

“Basically, the idea came from the doctors themselves. Inside a typical ICU we have a lot of equipment plugged into the patient. Most of these patients are suffering from chronic conditions, and the cases are largely complex in nature. The equipment connected to these patients gives doctors and nurses vital, critical information that helps in their treatment. Unfortunately, this equipment is not interconnected. The devices cannot talk to each other, which is a problem for a medical team managing a large ICU, especially in a case like ours, where we have 150 beds in the ICU. The doctors and nurses have to bundle all the information from the medical record and extract the data. They have to go from one bed to another to write down the data and transcribe it into medical records. This takes time and makes the workflow inefficient.”

To summarize, one of the primary challenges that a medical team in an ICU has to face is the acquisition and transcription of medical data. Since the equipment lacks interconnectivity, doctors and nurses have to be constantly on rounds to keep the records updated, which is crucial in an ICU setting.

This begs the question:

What causes this lack of interconnectivity?

Mr. Guilherme Rabello explained:

“This is because all the systems are proprietary. Systems from Siemens, GE, Toshiba etc. cannot communicate with modules from different brands. And for a hospital, it is impossible to have equipment from only one brand, due to technological silos and budget constraints. Furthermore, if two devices are of different kinds, like a cardiac monitor and an infusion pump, connectivity cannot be established at all. This means you have to record and monitor readings from each unit individually, which creates a burden for all.”

“So the ICU director said: why not create a system that simply connects to all these devices? The system could then display updated information from all the equipment, of all the different brands, on one single accessible dashboard. This way, doctors and nurses would be able to save time and channel it toward delivering improved patient care.”
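
To make the integration idea concrete, here is a hedged Python sketch of the normalization pattern such a system implies: vendor-specific adapters translating proprietary payloads into one common reading format. The field names and payloads are invented for illustration; this is not SAP’s or InovaInCor’s implementation.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Reading:
        """The one format the dashboard understands, whatever the brand."""
        patient_id: str
        metric: str
        value: float
        unit: str
        at: datetime

    def from_monitor_brand_a(raw: dict) -> Reading:
        # Invented payload for a proprietary cardiac monitor.
        return Reading(raw["pid"], "heart_rate", float(raw["hr"]), "bpm",
                       datetime.fromisoformat(raw["ts"]))

    def from_pump_brand_b(raw: dict) -> Reading:
        # Invented payload for a proprietary infusion pump.
        return Reading(raw["patient"], "infusion_rate", float(raw["ml_per_h"]),
                       "ml/h", datetime.fromisoformat(raw["time"]))

    feed = [
        from_monitor_brand_a({"pid": "bed-042", "hr": "88", "ts": "2017-09-07T10:15:00"}),
        from_pump_brand_b({"patient": "bed-042", "ml_per_h": "12.5",
                           "time": "2017-09-07T10:15:30"}),
    ]
    for r in feed:  # one normalized stream, regardless of vendor
        print(f"{r.patient_id}  {r.metric} = {r.value} {r.unit}  @ {r.at:%H:%M:%S}")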

However, it wasn’t just the tedious data acquisition workflow that was the concern; other challenges needed addressing as well. Mr. Guilherme Rabello expanded on them:

“Data personalization is another challenge that we wanted to address. Acquisition and transcription of data is all good, but you also need access to personalized data in an ICU setting if you want to administer effective interventions. The system needs a feature that provides exactly the data I want, not all of it.”

He continued listing the challenges:

“Sometimes, there are cases where a young doctor is present in an ICU. If the young doctor sees a critical vital sign become unstable, there is every chance that the doctor may panic due to lack of experience. A quick intervention is needed. What should the doctor do? Yes, doctors are trained in the protocols, but in a moment of panic a doctor may become confused and completely forget the protocol. We cannot leave things to chance. This was another challenge that we had to address.”

Working Towards the Solution

Mr. Guilherme Rabello and his team contacted equipment manufacturers and asked if they could help with a solution. Many manufacturers came forward claiming they had one, but none worked.

Until Mr. Guilherme Rabello approached SAP.

“A team of 50 members was formed: 30 people from InovaInCor and 20 people from SAP. The people who came from InovaInCor were all physicians, surgeons and nurses. This led to a design-thinking approach to the solution: the people who had the problem became part of finding the solution. We worked for six weeks and came up with a prototype. The prototype that we built, we used in our own hospital.”

The Solution

Describing the features of the prototype, Mr. Guilherme said:

“We finally came up with a system, thanks to the programming and engineering expertise of the team from SAP and the input from our medical team. The system can be installed at the nurse station, where it gives information on all the patients, based on data points extracted from every piece of equipment. The dashboard also allows doctors and nurses to personalize the data appearing on the touch screen. This way, if I have to do an intervention, I can access the dashboard and see the critical information to know exactly what is happening with the patient. This enables me to intervene quickly, without having to go through extensive medical records. Furthermore, the system allows doctors to access pre-fed protocols from their tablets, so they can deliver quality treatment in a short time without facing any hurdle.”

The Results of the Pilot Testing

The prototype yielded the following quantifiable results, which Mr. Guilherme described as remarkable.

“In the initial phases of testing, we calculated that the pilot deployment helped us reduce nurse workload by 85 percent. Currently, our nurses spend 1,000 hours per month just on the collection and input of data. The saved time can be leveraged to improve personal interaction with the patients, thus helping to improve patient care. Moreover, we also managed to improve the turnover of ICU beds by 7 percent. This means that, if deployed on a large scale, the system has the potential to save us $42,000 each day on our ICU budget. The money saved can then be utilized for further improving our services and expanding our service capacity.”

Mr. Guilherme Has a Message

And Mr. Guilherme has a message for the rest of the healthcare community:

“What I think is important is that the medical community needs to look at technology as a supporter, not something that creates more burden. Right now, most doctors see technology as a burden; something that drags their time away from the patient’s treatment. I hope this example can serve to change that behavior. Technology should assist; technology should enable. And if this approach is something that companies embrace, I believe we can move forward to create important solutions for everyone.”

This brings me to the end of this post. Digital transformation helped the medical team at InovaInCor deliver improved medical care without having to compromise on the human touch, the sense of ownership, or the familiarity index.

The time saved through technology adoption allowed nurses to interact more regularly with patients. The personalization of data that the system allowed gave doctors a sense of ownership. And because the system was devised through a design-thinking approach, practitioners got to work with technology they were familiar with.

This was a practical success case of digital transformation in healthcare. I hope it serves as encouragement for the community as a whole to come forward and embrace the benefits that the transformation has to offer. Watch the whole interview on YouTube.

About the Author

Ronald van Loon is an Advisory Board Member and Big Data & Analytics course advisor for Simplilearn. He contributes his expertise towards the rapid growth of Simplilearn’s popular Big Data & Analytics category.

If you would like to read more from Ronald van Loon on the possibilities of Big Data and the Internet of Things (IoT), please click “Follow” and connect on LinkedIn and Twitter.

Ronald

Ronald helps data-driven companies generate business value with best-of-breed solutions and a hands-on approach. He has been recognized as one of the top 10 global influencers for predictive analytics by DataConomy, and by Klout for Data Science, Big Data, Business Intelligence and Data Mining. He is a guest author on leading Big Data sites, a speaker, chairman and panel member at national and international webinars and events, and runs a successful series of webinars on Big Data and on Digital Transformation. He has been active in the data (process) management domain for more than 18 years, has founded multiple companies, and is now director at a data consultancy firm that is a leader in Big Data and data process management solutions. His interests span big data, data science, predictive analytics, business intelligence, customer experience and data mining. Feel free to connect on Twitter or LinkedIn to stay up to date on success stories.

More Posts - Website

Follow Me:
Twitter | LinkedIn


The post Digital Transformation in Healthcare: A Practical Success Case appeared first on Ronald van Loons.

 

September 06, 2017


Revolution Analytics

Knime 3.4 connections to Microsoft R, Azure

Version 3.4 of the Knime Analytics Platform, the open-source data science workflow toolbox, was released back in July. With that release came new integrations with Azure and Microsoft R Server, which...

...

Forrester Blogs

CDOs, CAOs (or Equivalent), We Want You!

Data leadership is proliferating. More than ½ of companies now have a chief data officer (CDO), and an almost equal number have a chief analytics officer (CAO). Many of those without these data...

...