Advanced Data Management

Workflow for cleansing, normalizing, organizing, and enriching data

Cutting-edge data extraction and cleansing capabilities

Deep Insight analytics for Knowledge Base Construction (KBC)

World-class data enrichment and data quality monitoring

This is what you've been wishing you could do with your data

Fractal's Advanced Data Management (ADM) provides the key to unlocking your data's potential. Textual data in any format - from websites across the Internet to any document in your enterprise - is easily cleaned, enriched, organized, integrated, and even augmented so you can take full advantage of it.

Enrich your data with tools to easily incorporate feedback from validated users and experts from across the world. Transfer Learning algorithms leverage lessons learned from previous analyses to draw similar conclusions about different but related data. Natural Language Processing and Deep Insight analytics extract sophisticated relationships between entities, including emotional state from sentiment analysis. Advanced modeling identifies knowledge gaps and recommends ways to address them. The result is a highly contextualized Knowledge Base that lays bare all the ways your data is interconnected, exposing dark data or hidden value by revealing complex correlations and dependencies within and across datasets.

ADM provides the framework for an unprecedented understanding of your data, and the tools to maximize its value.


Example Use Cases

Cutting-Edge Data Extraction and Cleansing Capabilities

Data Cleaning

Machine learning algorithms and distributed computing are applied to standardize the structure and formatting of data for consumption and analysis.

Data Deduplication

Natural Language Processing (NLP) algorithms are leveraged to identify and remove duplicate and redundant records from both structured and unstructured sources.
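
As an illustrative sketch of the underlying idea (not ADM's production algorithm), near-duplicate text records can be collapsed by normalizing them and comparing similarity scores, here with Python's standard-library difflib:

```python
from difflib import SequenceMatcher

def dedupe(records, threshold=0.9):
    """Keep the first occurrence of each record; drop near-duplicates whose
    normalized text is at least `threshold` similar to an already-kept one."""
    kept = []
    for text in records:
        norm = " ".join(text.lower().split())  # case-fold, collapse whitespace
        if not any(SequenceMatcher(None, norm, k).ratio() >= threshold for k in kept):
            kept.append(norm)
    return kept

rows = [
    "Acme Corp, 123 Main St.",
    "ACME Corp,  123 Main St.",   # duplicate up to case/whitespace
    "Globex Inc, 9 Elm Ave.",
]
print(dedupe(rows))  # the two Acme rows collapse to one
```

Note that this pairwise comparison is quadratic; at scale, a blocking or hashing step would narrow the candidate pairs first.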

Data Normalization

Real-time data processing capabilities allow data to be rendered uniformly (e.g. conversion of all currencies to US dollars, even on streaming data feeds).
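
A minimal sketch of the uniform-rendering step described above, using made-up placeholder exchange rates (a real deployment would pull live rates from a market data feed):

```python
# Illustrative only: these exchange rates are invented placeholders,
# not live market data.
RATES_TO_USD = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27, "JPY": 0.0067}

def to_usd(amount, currency):
    """Render a monetary value uniformly in US dollars."""
    return round(amount * RATES_TO_USD[currency], 2)

stream = [(100, "EUR"), (250, "USD"), (10_000, "JPY")]
normalized = [to_usd(amount, currency) for amount, currency in stream]
print(normalized)  # [108.0, 250.0, 67.0]
```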

Data Semantification

Data is automatically converted to standardized ontologies or Domain-Specific Languages (DSLs) to enable machines and people to understand, share, and reason with information efficiently at execution time.
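
In spirit, semantification maps flat fields onto shared vocabulary terms. The sketch below converts a record into subject-predicate-object triples; the field mapping and vocabulary terms are invented examples, not ADM's actual ontologies:

```python
def semantify(record, ontology_map, subject_field):
    """Convert a flat record into subject-predicate-object triples using a
    field-to-ontology-term mapping (terms here are illustrative examples)."""
    subject = record[subject_field]
    return [
        (subject, predicate, record[field])
        for field, predicate in ontology_map.items()
        if field != subject_field and record.get(field) is not None
    ]

record = {"id": "emp:1042", "name": "Ada", "dept": "Engineering"}
mapping = {"id": "ex:id", "name": "foaf:name", "dept": "org:memberOf"}
triples = semantify(record, mapping, "id")
print(triples)
```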


Extract, Transform, Load (ETL) activities can be performed on-demand to conform and integrate disparate data sets or data streams into a presentation-ready format.
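
A toy end-to-end sketch of that idea: two disparate sources (a CSV export and an API-style record list, both invented here) are conformed to a single presentation schema via per-source field mappings:

```python
import csv
import io

def extract_csv(text):
    """Extract: parse a CSV export into row dictionaries."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(row, mapping):
    """Transform: conform a source row to the shared target schema."""
    return {target: row[source] for target, source in mapping.items()}

def etl(sources):
    """Load: merge (rows, field_mapping) pairs into one presentation-ready list."""
    return [transform(row, mapping) for rows, mapping in sources for row in rows]

crm_rows = extract_csv("cust_name,cust_city\nAcme,Berlin\n")
erp_rows = [{"company": "Globex", "location": "Paris"}]

unified = etl([
    (crm_rows, {"name": "cust_name", "city": "cust_city"}),
    (erp_rows, {"name": "company", "city": "location"}),
])
print(unified)
```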

Deep Insight Analytics for Knowledge Base Construction

Natural Language Processing

NLP analytics are fundamental to applying machine learning capabilities to streaming or batched text data, including part-of-speech tagging, named entity recognition, relationship extraction, sentiment analysis, optical character recognition (converting images to text), and speech-to-text (or text-to-speech) conversion.

Knowledge Base Construction

Deep Insight analytics are leveraged across diverse data sources to extract highly specialized, complex, or even hidden relationships and correlations between entities to build a contextualized Knowledge Base for any observed system.

Text Mining

Derive high-quality information such as relevance, novelty, and interestingness from text by structuring it for pattern recognition and revealing hidden meaning or “dark” data. Methodologies include text categorization, clustering, concept/entity extraction, production of granular taxonomies, sentiment analysis, document summarization, and entity relation modeling.
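
One of those techniques, surfacing distinctive terms, can be sketched with a plain TF-IDF score over a tiny invented corpus (ADM's actual text-mining analytics are far richer):

```python
import math
from collections import Counter

def tfidf_keywords(docs, top=3):
    """Score terms by TF-IDF so distinctive words outrank common ones."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(word for doc in tokenized for word in set(doc))  # document frequency
    n = len(tokenized)
    results = []
    for doc in tokenized:
        tf = Counter(doc)
        scores = {w: tf[w] * math.log(n / df[w]) for w in tf}
        results.append([w for w, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top]])
    return results

docs = [
    "the merger boosts the share price",
    "the drought hurts the wheat harvest",
]
print(tfidf_keywords(docs))  # "the" scores zero; distinctive terms rise to the top
```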

Sentiment Analysis

ML-driven contextual analysis enables you to confidently derive the attitude of a speaker, writer, or other subject with respect to some topic, whether it be a judgment or evaluation by an author or speaker, their emotional state, or the intended emotional impact of their content.

News Headline Sentiment Classification

With respect to stock portfolio management, for example, ADM allows traders to identify (1) the sentiment polarity of a news feed, (2) the entities or assets named in that news feed, (3) the impact of the news feed on an individual stock or asset in the portfolio, (4) the overall impact of global sentiment on the portfolio, and (5) the potential impact of recommended actions (e.g. buy or sell).
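
A toy illustration of steps (1) and (2), sentiment polarity and named-asset detection; the lexicon and watchlist below are invented, and a production system would use trained models rather than word lists:

```python
# Invented toy lexicon and watchlist, for illustration only.
POSITIVE = {"surges", "beats", "upgrade", "record"}
NEGATIVE = {"plunges", "misses", "downgrade", "recall"}
WATCHLIST = {"ACME", "GLOBEX"}

def classify_headline(headline):
    """Return the headline's sentiment polarity and any watchlist assets named."""
    words = headline.upper().replace(",", "").split()
    polarity = sum(w.lower() in POSITIVE for w in words) - \
               sum(w.lower() in NEGATIVE for w in words)
    entities = sorted(WATCHLIST & set(words))
    label = "positive" if polarity > 0 else "negative" if polarity < 0 else "neutral"
    return {"sentiment": label, "entities": entities}

print(classify_headline("ACME beats earnings forecast"))
# {'sentiment': 'positive', 'entities': ['ACME']}
```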

World-Class Data Enrichment and Data Quality Monitoring

Crowdsourced Data Enhancement

Leverage human intelligence via human-in-the-loop (HITL) methodologies to train machine learning algorithms, incorporating widespread assessments (such as content evaluations or approval of search results) to improve model performance over time.

Structured Expert Judgement Enrichment

Utilize feedback from Subject Matter Experts (SMEs) within the enterprise or incorporate extracted expert judgement from across the web to enrich existing data or validate specialized query or model results to improve future outcomes.

Synthetic Data Generation

ML-driven analytics reveal gaps within a Knowledge Base where entities or processes have not yet been identified but are likely to exist given the overall context of an observed system. Customized tools then generate data to fill those gaps for modeling purposes, further enhanced by crowdsourced and expert feedback.
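
A crude stand-in for the gap-filling step: missing values are synthesized by sampling around the observed distribution of a field. Real model-driven generation is far more sophisticated; the sensor records below are invented:

```python
import random
import statistics

random.seed(7)  # reproducible sketch

def fill_gaps(records, field):
    """Fill missing values in `field` by sampling around the observed
    mean/stdev of that field, tagging each filled row as synthetic."""
    observed = [r[field] for r in records if r[field] is not None]
    mu, sigma = statistics.mean(observed), statistics.stdev(observed)
    for r in records:
        if r[field] is None:
            r[field] = round(random.gauss(mu, sigma), 1)
            r["synthetic"] = True
    return records

sensors = [{"temp": 20.1}, {"temp": 19.8}, {"temp": None}, {"temp": 20.4}]
filled = fill_gaps(sensors, "temp")
print(filled[2])  # the gap now holds a plausible, flagged synthetic value
```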

New Data Label Solution (NDLS)

Today's state-of-the-art machine learning models require massive amounts of labeled training data, which rarely exists for real-world applications. NDLS rapidly creates, models, and manages these training datasets using methodologies such as expectation-maximization algorithms with varying degrees of supervised learning.
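
A heavily simplified stand-in for that idea: programmatic labeling functions vote on each example and a majority rule aggregates them (a real label model, e.g. an EM-based one, would also learn per-function accuracies). The labeling functions below are invented:

```python
from collections import Counter

# Invented labeling functions: cheap heuristics that vote on each example.
def lf_keyword(text):   return "spam" if "free" in text.lower() else None
def lf_length(text):    return "ham" if len(text) > 40 else None
def lf_shouting(text):  return "spam" if text.isupper() else None

def weak_label(text, lfs):
    """Aggregate labeling-function votes by majority; abstain on ties or no votes."""
    votes = Counter(label for lf in lfs if (label := lf(text)) is not None)
    if not votes:
        return None
    (top, n), *rest = votes.most_common()
    return top if not rest or n > rest[0][1] else None

lfs = [lf_keyword, lf_length, lf_shouting]
print(weak_label("FREE PRIZES CLICK NOW", lfs))  # spam
```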

Data Health Monitoring

Define metrics and thresholds to track the performance and health of datasets and data feeds over time, triggering notifications or alerts for interesting events or anomalous behavior based on increasingly precise benchmarks.
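
The metric-and-threshold pattern can be sketched as follows; the metric names and benchmark ranges are invented examples:

```python
def check_health(metrics, thresholds):
    """Compare observed metrics to threshold ranges; return alerts for breaches
    or missing metrics."""
    alerts = []
    for name, (lo, hi) in thresholds.items():
        value = metrics.get(name)
        if value is None or not (lo <= value <= hi):
            alerts.append(f"ALERT {name}: value={value}, expected [{lo}, {hi}]")
    return alerts

# Hypothetical feed metrics: the null rate has crept above its benchmark.
thresholds = {"null_rate": (0.0, 0.02), "rows_per_min": (900, 1100)}
metrics = {"null_rate": 0.05, "rows_per_min": 1000}
for alert in check_health(metrics, thresholds):
    print(alert)
```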


Unprecedented Flexibility

Easily integrate applications and services to build complex workflows and automations in minutes.

Simple Usability

Guided component configuration and an intuitive drag-and-drop workflow builder UI are provided in an interactive web portal.

Elevation of Human Talent

Automation of mundane business processes frees up human talent to focus on the things machines can't do.

Improved Accuracy

Automated, data-enriched workflows provide the context to produce the right result or decision the first time.

Increased Consistency

Identical processing based on customizable conditions and triggers reduces output variation.

Greater Productivity

Tedious and repetitive tasks can be performed by virtual full-time employees working 24/7/365.

Adjustable Autonomy

The degree of automation associated with any workflow can be adjusted dynamically to accommodate human intervention as needed.

Dynamic Scalability

Easily add or remove workflow automations or participating components to achieve new integrations or outcomes.

Compliance and Accountability

Automated logging provides a comprehensive audit trail of the outcomes associated with every step of a workflow.

Fault Tolerance

Distributed, cloud-based deployment ensures high availability of data and services to meet the most demanding SLAs.

Continuous Monitoring

Automatic notifications and interventions are easily implemented to protect against silent failures and ensure timely response.