Big data maturity model

Big data maturity models (BDMMs) are artifacts used to measure big data maturity.[1] These models help organizations create structure around their big data capabilities and identify where to start.[2] They provide tools that assist organizations in defining goals around their big data program and in communicating their big data vision to the entire organization. BDMMs also provide a methodology to measure and monitor the state of a company's big data capability, the effort required to complete its current stage or phase of maturity, and the effort required to progress to the next stage. Additionally, BDMMs measure and manage the speed of both the progress and adoption of big data programs in the organization.[1]

The goals of BDMMs are:

  • To provide a capability assessment tool that generates specific focus on big data in key organizational areas
  • To help guide development milestones
  • To avoid pitfalls in establishing and building big data capabilities

Key organizational areas refer to "people, process and technology", and the subcomponents include alignment, architecture, data, data governance, delivery, development, measurement, program governance, scope, skills, sponsorship, statistical modelling, technology, value and visualization.[3]
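
As a concrete illustration, the sketch below shows one way such a capability assessment could be represented in code, rolling 1-to-5 subcomponent ratings up into the three key organizational areas. The grouping of subcomponents under each area and the rating scale are assumptions made for this example; the published models define their own instruments.

```python
from statistics import mean

# Hypothetical grouping of the fifteen subcomponents under the three key
# organizational areas ("people, process and technology"). The article
# lists the subcomponents but not this grouping, so it is illustrative.
ASSESSMENT_AREAS = {
    "people": ["skills", "sponsorship", "alignment", "scope"],
    "process": ["data governance", "program governance", "delivery",
                "development", "measurement", "value"],
    "technology": ["architecture", "data", "statistical modelling",
                   "technology", "visualization"],
}

def area_scores(ratings: dict[str, int]) -> dict[str, float]:
    """Average 1-5 subcomponent ratings up to their organizational area."""
    return {
        area: mean(ratings[sub] for sub in subs)
        for area, subs in ASSESSMENT_AREAS.items()
    }

# Example ratings, as might come from an assessment survey (invented values).
ratings = {sub: 2 for subs in ASSESSMENT_AREAS.values() for sub in subs}
ratings["skills"] = 4  # one strong subcomponent lifts the "people" area
print(area_scores(ratings))  # "people" averages 2.5; the other areas stay at 2
```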

The stages or phases in BDMMs depict the various ways in which data can be used in an organization and are among the key tools for setting direction and monitoring the health of an organization's big data programs.[4][5]

An underlying assumption is that a high level of big data maturity correlates with increased revenue and reduced operational expense. However, reaching the highest level of maturity involves major investments over many years.[6] Only a few companies are considered to be at a "mature" stage of big data and analytics. These include internet-based companies (such as LinkedIn, Facebook, and Amazon) as well as non-internet-based companies, including financial institutions (fraud analysis, real-time customer messaging and behavioral modeling) and retail organizations (click-stream analytics together with self-service analytics for teams).[6]

YouTube Encyclopedic

  • What's Your Data Management Maturity?
  • What is a Data Governance Maturity Model? #datagovernance #maturitymodel
  • TDWI's Big Data Analytics Maturity Model - TDWI World Conference Chicago 2013

Transcription

Hi, I'm Jared Hillam. I've noticed that there seems to be a maturity continuum which nearly every organization moves through in managing data. In this video I'd like to outline this continuum, which we see different organizations follow. I want to point out up front that this doesn't represent what ideally should occur within organizations, but rather what naturally occurs as an organization grows in revenue, and the habits that are formed throughout this lifecycle.

I'll start with organizations that are roughly 50 million dollars in revenue. I like to call this the Reflexive Stage. In this stage, organizations are basically reacting to each request for information on a one-off basis. There is little to no reuse in the data-gathering process, so every time a manager or director needs updated data or information, it takes a manual effort to deliver it. Organizations in this stage will often have multiple tools in use across individuals, and a lot of man-hours spent on what managers might think are simple requests. Seeking to create some repeatability in the work they do, they'll often deploy operational reporting tools that allow standardized reports to be delivered to end users. These organizations also often purchase desktop data-analysis tools that allow those one-off requests to be more easily delivered and analyzed. The use of these tools often bridges the organization's growth into the Inclusive Stage.

In the Inclusive Stage, organizations begin to realize they have a lot of operational reports and a lot of desktop analytics with nested assumptions. When a change needs to be made to those assumptions, each individual report needs to be opened and edited. Also, the business users begin to want control of their own analysis and are no longer willing to let an IT person be the liaison to their data. In addition, they want more direct access to shared logic so they can work as teams. This is where organizations begin adopting business intelligence platforms and standardized business intelligence logic that can be shared across users. These platforms allow business users to log into secured enterprise applications that are built for collaborative analytics and shared logic, which the IT team can maintain. And this is where a critical juncture occurs: if organizations do not quickly transition to the Truthful Stage, they will find themselves believing that their BI platform is faulty, or end up incredibly frustrated with the data-consumption process.

In the Truthful Stage, the organization begins to see that the different sources of data need to come together in order to provide a cohesive story. This usually reveals itself in executive meetings where there are multiple versions of the truth and decisions can't be made. Organizations that have lingered in the Inclusive Stage often believe their BI tool is faulty and seek its immediate replacement. Replacing the BI tool is usually a mistake, as the problem is really only addressed through a data mart or data warehouse implementation, using an ETL tool.

Then we enter the Governed Stage. Once a data warehouse and BI platform are in place, organizations increasingly realize that their data is not clean. This becomes apparent when they open a customer, say GE for example, and find multiple company names that represent GE, including ones that are simply spelled incorrectly. Additionally, they will find that many of the stored records have incorrect mailing addresses and other data-quality problems. This wide array of data issues becomes more obvious as organizations centralize their data usage. To address it, organizations adopt data-quality tools to make changes at a bulk level, and master data management solutions to consolidate their critical attributes into surviving records and hierarchies that the business can agree on and control. The business will commonly organize a data governance team to manage the data.

This leads us to the Expansive Stage. As organizations become more and more mature in building data warehouses and governing their data, they quickly realize that the size of the data footprint is slowing down the accessibility of the information. This moves organizations into one of two strategies. The first strategy is archiving: inactive data is offloaded and maintained in a separate archived tier, relieving the production system of antiquated records which are rarely accessed and thereby speeding it up. The second strategy is a performance expansion of the data warehouse called a data warehousing appliance. These appliances are specialized configurations of hardware and software which widen the pipe for larger database queries, allowing billions of rows to be accessed on the fly. Data warehousing appliances are often lumped into a larger category which the market calls big data. In the Expansive Stage we also see organizations adopting other big data solutions, like Hadoop, which provides a cost-effective way of processing large quantities of unstructured data. The architecture behind Hadoop originated from Google's efforts to deal with its massive indexes while reducing the cost of hardware.

Like I said in the beginning, these stages don't represent an ideal path organizations should follow, but rather a natural path based on the problems an organization faces as it grows. Perhaps in an ideal world you could build a master data management backbone before you rolled out all your applications, and you could have your data warehouse in place before you built your business intelligence content. But in the real world, it's difficult to budget a fix for a problem that isn't readily noticeable. Now that you can see the whole picture, you will probably identify your organization as still dealing with Reflexive Stage issues even though it is in later stages, or perhaps dealing with Governed Stage issues while still in the Reflexive Stage. There really is no cookie-cutter model for how organizations should behave, but it is important to understand what solutions exist for the problems you are facing today. Intricity is highly aware of these solutions and is a very capable partner in helping you plan the future. I recommend you reach out to Intricity and talk with one of our specialists. We take an active role in our clients' strategic roadmaps, and we simplify the complexity of their realization.
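
The Governed Stage problem in the transcript (many stored spellings of one customer such as GE) is essentially a record-matching task. The following is a minimal sketch of clustering name variants before choosing a surviving record, using only Python's standard library; real data-quality and master data management tools apply far more sophisticated matching and survivorship rules than this.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude similarity between two normalized names, in [0.0, 1.0]."""
    def norm(s: str) -> str:
        return s.lower().replace(".", "").replace(",", "").strip()
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

def cluster_names(names: list[str], threshold: float = 0.8) -> list[list[str]]:
    """Group variants whose similarity to a cluster's first member passes
    the threshold; each cluster would then yield one surviving record."""
    clusters: list[list[str]] = []
    for name in names:
        for cluster in clusters:
            if similarity(name, cluster[0]) >= threshold:
                cluster.append(name)
                break
        else:  # no cluster matched, so this name starts a new one
            clusters.append([name])
    return clusters

# The transcript's example: several stored spellings of the same customer.
variants = ["General Electric", "General Electric Co.",
            "Genral Electric", "Acme Corp"]
print(cluster_names(variants))
# [['General Electric', 'General Electric Co.', 'Genral Electric'], ['Acme Corp']]
```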

Categories

Big data maturity models can be broken down into three broad categories, namely:[1]

  • Descriptive
  • Comparative
  • Prescriptive

Descriptive

Descriptive models assess a firm's current maturity through qualitative positioning of the firm in various stages or phases. These models do not provide any recommendations as to how a firm could improve its big data maturity.

Big data and analytics maturity model (IBM model)

This descriptive model aims to assess the value generated from big data investments towards supporting strategic business initiatives.

Maturity levels

The model consists of the following maturity levels:

  • Ad-hoc
  • Foundational
  • Competitive differentiating
  • Break away

Assessment areas

The maturity levels also cover assessment areas in a matrix format, focusing on business strategy, information, analytics, culture and execution, architecture, and governance.[7]

Knowledgent big data maturity assessment

Consisting of an assessment survey, this big data maturity model assesses an organization's readiness to execute big data initiatives. Furthermore, the model aims to identify the steps and appropriate technologies that will lead an organization towards big data maturity.[8]

Comparative

Comparative big data maturity models aim to benchmark an organization in relation to its industry peers and normally consist of a survey containing quantitative and qualitative information.

CSC big data maturity tool

The CSC big data maturity tool acts as a comparative tool to benchmark an organization's big data maturity. A survey is undertaken and the results are then compared to those of other organizations within a specific industry and within the wider market.[9]
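
A minimal sketch of the benchmarking arithmetic a comparative tool of this kind might perform, assuming survey responses are condensed to a single numeric score per organization; the scores below are invented for illustration and do not come from the CSC tool.

```python
def percentile_rank(org_score: float, peer_scores: list[float]) -> float:
    """Percentage of peers scoring at or below the organization."""
    at_or_below = sum(1 for s in peer_scores if s <= org_score)
    return 100.0 * at_or_below / len(peer_scores)

# Invented maturity-survey scores (1-5 scale) for industry peers.
industry_peers = [2.1, 2.8, 3.0, 3.3, 3.9, 4.2]
print(percentile_rank(3.1, industry_peers))  # 50.0: mid-pack in the industry
```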

TDWI big data maturity model

The TDWI big data maturity model is one of the established models in the current big data maturity area and is therefore backed by a significant body of knowledge.[6]

Maturity stages

The different stages of maturity in the TDWI BDMM can be summarized as follows:

Stage 1: Nascent

The nascent stage is characterized as a pre–big data environment. During this stage:

  • The organization has a low awareness of big data or its value
  • There is little to no executive support for the effort, and only some people in the organization are interested in the potential value of big data
  • The organization understands the benefits of analytics and may have a data warehouse
  • The organization's governance strategy is typically IT-centric rather than integrating business and IT

Stage 2: Pre-adoption

During the pre-adoption stage:

  • The organization starts to investigate big data analytics

Stage 3: Early adoption (the "chasm")

There is then generally a series of hurdles the organization needs to overcome before it reaches corporate adoption. These hurdles include:

  • Obtaining the right skill set to support the capability, including Hadoop and advanced analytical skills
  • Political issues: big data projects are conducted in particular areas within the organization, and attempts to expand the effort or to enforce more stringent standards and governance lead to issues regarding ownership and control

Stage 4: Corporate adoption

The corporate adoption stage is characterized by the involvement of end-users: the organization gains further insight, and its way of conducting business is transformed. During this stage:

  • End-users might have started operationalizing big data analytics or changing their decision-making processes
  • Most organizations would already have repeatedly addressed certain gaps in their infrastructure, data management, governance and analytics

Stage 5: Mature / visionary

Only a few organizations can be considered visionary in terms of big data and big data analytics. During this stage an organization:

  • is able to execute big data programs as a well-oiled machine with highly mature infrastructure
  • has a well-established big data program and big data governance strategies
  • executes its big data program as a budgeted and planned initiative from an organization-wide perspective
  • has employees who share a level of excitement and energy around big data and big data analytics
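
Staged models like this one are often operationalized by mapping an overall assessment score onto a stage label. The sketch below illustrates such a mapping; the stage names are TDWI's, but the numeric score boundaries are invented for this example.

```python
# Stage names from the TDWI model above; the score boundaries are invented.
STAGES = [
    (1.0, "Nascent"),
    (2.0, "Pre-adoption"),
    (3.0, "Early adoption"),
    (4.0, "Corporate adoption"),
    (5.0, "Mature / visionary"),
]

def stage_for(score: float) -> str:
    """Return the first stage whose upper bound covers the overall score."""
    for upper_bound, name in STAGES:
        if score <= upper_bound:
            return name
    return STAGES[-1][1]  # clamp anything above the scale to the top stage

print(stage_for(2.4))  # Early adoption
```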

Research findings

TDWI[6] assessed 600 organizations and found that the majority were either in the pre-adoption (50%) or early adoption (36%) stage. Only 8% of the sample had managed to move past the chasm to corporate adoption or the mature/visionary stage.

Prescriptive

The majority of prescriptive BDMMs follow a similar modus operandi: the current situation is first assessed, after which phases plot a path towards increased big data maturity. Examples are:

Info-tech big data maturity assessment tool

This maturity model is prescriptive in the sense that it consists of four distinct phases that together plot a path towards big data maturity (a minimal sketch of the next-step logic follows the list below).[10] The phases are:

  • Phase 1, undergo big data education
  • Phase 2, assess big data readiness
  • Phase 3, pinpoint a killer big data use case
  • Phase 4, structure a big data proof-of-concept project
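
A minimal sketch of the prescriptive next-step logic referred to above: given how many phases an organization has completed, return the remaining roadmap. The phase list is the model's; the function itself is illustrative and not part of the published tool.

```python
# The four Info-tech phases listed above, in order.
PHASES = [
    "Undergo big data education",
    "Assess big data readiness",
    "Pinpoint a killer big data use case",
    "Structure a big data proof-of-concept project",
]

def remaining_roadmap(phases_completed: int) -> list[str]:
    """Phases still ahead of an organization that has finished the first
    `phases_completed` phases (illustrative, not the published tool)."""
    return PHASES[phases_completed:]

print(remaining_roadmap(1))  # everything after the education phase
```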

Radcliffe big data maturity model

The Radcliffe big data maturity model, like other models, consists of distinct maturity levels:[5]

  • 0 – "In the dark"
  • 1 – "Catching up"
  • 2 – "First pilot"
  • 3 – "Tactical value"
  • 4 – "Strategic leverage"
  • 5 – "Optimize and extend"

Booz & Company's model

This BDMM provides a framework that enables organizations not only to view the extent of their current maturity, but also to identify goals and opportunities for growth in big data maturity. The model consists of four stages, namely:[4]

  • Stage 1: Performance management
  • Stage 2: Functional area excellence
  • Stage 3: Value proposition enhancement
  • Stage 4: Business model transformation

Van Veenstra's model

The prescriptive model proposed by Van Veenstra aims first to explore an organization's existing big data environment, followed by exploitation opportunities and a growth path towards big data maturity. The model makes use of four phases, namely:[11]

  • Efficiency
  • Effectiveness
  • New solutions
  • Transformation

Critical evaluation

Current BDMMs have been evaluated under the following criteria:[1]

  • Completeness of the model structure (completeness, consistency)
  • The quality of model development and evaluation (trustworthiness, stability)
  • Ease of application (ease of use, comprehensibility)
  • Big data value creation (actuality, relevancy, performance)

The TDWI and CSC models have the strongest overall performance, with steady scores in each of the criteria groups. The overall results communicate that the top-performing models are extensive, balanced, well documented, easy to use, and address a good number of big data capabilities that are utilized in business value creation. The models of Booz & Company and Knowledgent are close seconds; these mid-performers address big data value creation in a commendable manner but fall short on completeness and ease of application. Knowledgent suffers from poor quality of development, having barely documented any of its development processes. The rest of the models (Info-tech, Radcliffe, Van Veenstra and IBM) are categorized as low performers. Whilst their content is well aligned with business value creation through big data capabilities, they all lack quality of development, ease of application and extensiveness. The lowest scores were awarded to IBM and Van Veenstra, since both provide only low-level guidance for their maturity model's practical use and almost completely lack documentation, ultimately resulting in poor quality of development and evaluation.[1]
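
To make this kind of benchmarking concrete, the sketch below ranks models by their mean score across the four criteria groups. The criteria names mirror the list above, but the numeric scores are invented placeholders, not the figures from Braun (2015).

```python
# Criteria groups from the evaluation above; the scores are invented
# placeholders on a 1-5 scale, not the figures from Braun (2015).
scores = {
    "TDWI": {"structure": 4, "development": 4, "ease": 4, "value": 4},
    "CSC": {"structure": 4, "development": 4, "ease": 4, "value": 4},
    "Knowledgent": {"structure": 2, "development": 1, "ease": 3, "value": 4},
    "IBM": {"structure": 2, "development": 1, "ease": 2, "value": 3},
}

def rank_models(scores: dict[str, dict[str, int]]) -> list[tuple[str, float]]:
    """Order models by their unweighted mean across the criteria groups."""
    means = {model: sum(c.values()) / len(c) for model, c in scores.items()}
    return sorted(means.items(), key=lambda item: item[1], reverse=True)

for model, mean_score in rank_models(scores):
    print(f"{model}: {mean_score:.2f}")  # TDWI and CSC tie at the top
```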

References

  1. ^ a b c d e Braun, Henrik (2015). "Evaluation of Big Data Maturity Models: A benchmarking study to support big data assessment in organizations". Master's thesis, Tampere University of Technology.
  2. ^ Halper, F., & Krishnan, K. (2014). TDWI Big Data Maturity Model Guide. TDWI Research.
  3. ^ Krishnan (2014). "Measuring maturity of big data initiatives". Archived from the original on 2015-03-16. Retrieved 2017-05-21.
  4. ^ a b El-Darwiche; et al. (2014). "Big Data Maturity: An action plan for policymakers and executives". World Economic Forum.
  5. ^ a b "Leverage a Big Data Maturity model to build your big data roadmap" (PDF). 2014. Archived from the original (PDF) on 2017-08-02. Retrieved 2017-05-21.
  6. ^ a b c d Halper, Fern (2016). "A Guide to Achieving Big Data Analytics Maturity". TDWI Benchmark Guide.
  7. ^ "Big Data & Analytics Maturity Model". IBM Big Data & Analytics Hub. Retrieved 2017-05-21.
  8. ^ "Home | Big Data Maturity Assessment". bigdatamaturity.knowledgent.com. Archived from the original on 2015-02-14. Retrieved 2017-05-21.
  9. ^ "CSC Big Data Maturity Tool: Business Value, Drivers, and Challenges". csc.bigdatamaturity.com. Retrieved 2017-05-21.
  10. ^ "Big Data Maturity Assessment Tool". www.infotech.com. Retrieved 2017-05-21.
  11. ^ van Veenstra, Anne Fleur. "Big Data in Small Steps: Assessing the value of data" (PDF). White Paper.