IT PARK
    Big Data

    Why do 85% of Big Data projects end up in failure?

    Companies tend to make their big data projects large in size and scope, but the truth is that most of these projects end up failing.
    Updated: Sep 13, 2023

    In 2016, Gartner estimated that about 60 percent of all big data projects would fail. A year later, Gartner analyst Nick Heudecker said that figure was "too conservative," and that the failure rate was closer to 85 percent. He still thinks so. In fact, the number of customers successfully running Hadoop in production is likely fewer than 20, perhaps even fewer than 10. That is a shocking result considering how long the technology has been around and how much the industry has invested in it.

    Anyone familiar with big data knows the problem is real and serious, and not entirely technical. In fact, technology is a secondary cause of failure; the root causes lie elsewhere. Here are four major reasons why big data projects fail, and four ways they can succeed.

         Problem #1: Poor integration

    Heudecker says there is one major technical problem behind big data failures: integrating isolated data from multiple sources to achieve the processing power enterprises need. Establishing connections to isolated legacy systems isn't easy, and the cost of integration, he says, is five to ten times the cost of the software. One of the biggest problems is simple integration: how do you link multiple data sources together? Many choose the data lake route, thinking it will be easy, but that's not the case.

    Isolated data is only part of the problem. Customers tell him that when they pull data from source systems into a common environment such as a data lake, they can't figure out what the values mean.
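    The semantics problem described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the system names, field names, and codes are invented, not from the article): two legacy systems export the same logical fact under different names and encodings, so dumping both into a lake loses meaning unless each system's vocabulary is explicitly mapped to a shared one.

```python
# Records as they might arrive from two hypothetical source systems.
crm_rows = [
    {"cust_id": "A1", "status": "2"},    # "2" means "active" in the CRM
    {"cust_id": "B7", "status": "3"},    # "3" means "churned"
]
billing_rows = [
    {"customer": "A1", "state": "ACT"},  # same fact, different encoding
    {"customer": "B7", "state": "CHN"},
]

# Explicit mappings document what each system's values actually mean --
# the step that gets skipped when data is dumped into a lake "as is".
CRM_STATUS = {"2": "active", "3": "churned"}
BILLING_STATE = {"ACT": "active", "CHN": "churned"}

def normalize(rows, id_key, val_key, mapping):
    """Translate one system's records into a shared vocabulary."""
    return {r[id_key]: mapping[r[val_key]] for r in rows}

crm = normalize(crm_rows, "cust_id", "status", CRM_STATUS)
billing = normalize(billing_rows, "customer", "state", BILLING_STATE)

# Only after normalization can the two sources be compared meaningfully.
conflicts = {k for k in crm if crm[k] != billing.get(k)}
print(conflicts)  # set() -- the sources agree once decoded
```

    Without the mapping tables, a query against the raw lake would see "2" in one table and "ACT" in another and have no way to know they describe the same customer state.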

          Problem #2: Unclear Goals

    Most people assume that companies have clear goals when they undertake big data projects, but that's often not the case. Many companies start the project first and only then think about the goals. "You have to take a hard look at this," says Ray Christopher, product marketing manager at Talend, a data integration software company. "People think they can connect structured and unstructured data and get the information they need. But this has to be targeted in advance: what kind of information do you want?"

          Problem #3: Skills Gap

    Too often, companies assume that the internal skills they built for data warehousing will translate to big data, and that's not the case. For starters, data warehouses and big data platforms process data in opposite ways: a data warehouse enforces schema on write, meaning data is processed and organized before it enters the warehouse.

    Big data platforms instead use schema on read: data is accumulated in raw form, and a schema is applied only when the data is read and processed. So if data processing moves from one approach to the other, the skills and tools must move with it.

         Problem #4: Technology Generation Gap

    Big data projects often take data from old data silos and try to merge it with new data sources such as sensors, web traffic, or social media. This isn't entirely the fault of enterprises, which collected that data before big data analytics came along, but it is a problem nonetheless.

    According to Greenbaum, the biggest skill enterprises are missing is how to merge these two kinds of data sources so they can work together to solve complex problems. Data silos are a barrier to big data projects because they follow no common standards; when companies start planning, they discover that these systems were never implemented in a way that makes the data reusable.

         Solution 1: Plan ahead

    It's a cliché, but it applies to big data projects: successful companies are the ones that get results by choosing something small, achievable, and new to plan and implement. "They need to think about the data first," says Morrison, "and model the enterprise in a machine-readable way so that the data serves the enterprise."

         Solution 2: Working Together

    Stakeholders are often left out of big data projects. Heudecker says that if all stakeholders work together, they can overcome many obstacles. It helps to have technical staff collaborating with the business units to deliver actionable results.

    Christopher believes big data projects should be a team sport, with everyone helping to collect, curate, and process the data to improve its integrity.

         Solution 3: Narrow the focus

    People seem to assume that big data projects require very big moves. But as with learning anything for the first time, the best way to succeed is to start small and scale up over time.

    "They should carefully define what they're doing," Heudecker says. "They should pick a problem domain and solve it, such as fraud detection, customer segmentation, or figuring out what new products to introduce in the millennial market."

         Solution 4: Leave Tradition Behind

    It's important to avoid clinging to existing infrastructure simply because the enterprise already holds a license for it. New, complex problems often require new solutions; reaching for the old tools of the past is not the right approach and may even doom a big data project.

    Morrison believes enterprises should stop sticking to their old ways, and that they can no longer rely on vendors to solve complex system problems for them. "For decades, many people seem to have assumed that any big data problem is a systems problem," he says. "But when faced with complex system change, companies must build their own solutions."

    Copyright © 2023 itheroe.com. All rights reserved.
