We create value from data
Domain expertise. Policy and business strategy. Novel AI/big data tools. All in one place.
We work together with you to identify the problem at hand, help you formulate the right questions, scout for the right data, whether inside your organisation or in open data outside, and transform it into actionable processes.
Big data and AI in support of decision making: The path to the right decision in policy or in business is not always straightforward. Big data has permeated our world, and new technologies open a window of opportunity. But these come loaded with uncertainty, as they need to be tailored to the problem at hand, and tested and verified by human knowledge and expertise.
Human-in-the-loop: Our team combines Artificial Intelligence with domain expertise to build tools that embed our experts' knowledge in the technology (after all, NLP and Machine Learning do what we tell them to do) and include human interaction checkpoints for validation and adaptation (learning).
Starting with our domain expertise and knowledge
We first study the issue to understand how best to approach it. With our expertise and partners we apply qualitative and quantitative methodologies, complementing one another where necessary.
When data enters the picture, we explore how your organisation's data, or data from other sources, can be used to extract information and derive knowledge, how it can be connected, and we propose a targeted AI-assisted solution. Throughout, we apply our tools for cleaning and merging data while ensuring GDPR compliance and privacy.
...and build tailor-made solutions to respond to specific needs
by embedding in our AI tools the domain knowledge and expertise that come from our team and your needs; by using big data mechanisms and graph analysis to make inferences otherwise hidden by the volume of data; and by applying customised, interactive visualisations to detect patterns, trends, and anomalies.
1. Collect data
Data comes in many shapes, sizes, and colours, e.g., textual, numerical, or visual. Choosing the appropriate input datasets, acquiring them, and bringing them into a processable form is the first step towards ensuring a high-quality output in the end.
2. Prepare data
Making sense of the acquired data is a painstaking process, requiring both tedious and intelligent work. Value is added by pooling, merging, cleaning, transforming, projecting, linking, and semantically annotating data from different initial sources into larger and/or more cohesive datasets, also employing text and data mining techniques to extract information. Furthermore, the type of model to be used for the problem at hand is formulated afresh or chosen from a family of appropriate pre-specified or learned model types.
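As a small illustration of this preparation step, here is a minimal Python sketch (using pandas) of merging and cleaning two datasets; the sources, region codes, and column names are hypothetical examples, not actual OPIX data:

```python
import pandas as pd

# Hypothetical source A: survey responses keyed by region code
survey = pd.DataFrame({
    "region": ["AT11", "AT12", "AT12", "DE21", None],
    "score":  [3.2, 4.1, 4.1, 2.8, 3.9],
})

# Hypothetical source B: open statistical data for the same regions
stats = pd.DataFrame({
    "region": ["AT11", "AT12", "DE21"],
    "population": [294_000, 1_690_000, 4_700_000],
})

# Clean: drop rows with missing keys and exact duplicates
survey = survey.dropna(subset=["region"]).drop_duplicates()

# Merge the two sources into one cohesive dataset
merged = survey.merge(stats, on="region", how="inner")
print(merged)
```

In practice the cleaning and linking rules are far richer (semantic annotation, entity resolution), but the pattern of pooling separate sources into one dataset is the same.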
3. Train models
Machine learning algorithms are applied to analyse the data, training and producing models as the key result of the process. The suite of algorithms includes topic modelling, correlation and causality modelling, regression modelling, and others.
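To illustrate one algorithm from this suite, here is a minimal topic-modelling sketch with scikit-learn; the tiny corpus and the choice of two topics are illustrative assumptions:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical corpus of short policy documents
docs = [
    "renewable energy subsidies and carbon tax policy",
    "carbon emissions trading and energy policy reform",
    "hospital funding and public health insurance",
    "health insurance reform and hospital staffing",
]

# Turn text into a document-term matrix
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)

# Fit a two-topic model; fit_transform yields, for each document,
# a probability distribution over the topics
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(dtm)

print(doc_topics.shape)  # one row per document, one column per topic
```

The trained model is the key artefact: its per-topic term weights (`lda.components_`) can then be inspected by domain experts as part of the human-in-the-loop validation.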
4. Test models
Standard techniques are applied to validate the predictive power of the models produced, possibly feeding back into the data preparation and training steps for appropriate adjustments when validation fails.
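One standard such technique is cross-validation. The sketch below uses scikit-learn on synthetic data; the 0.7 acceptance threshold is an illustrative assumption, not a fixed criterion:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a prepared dataset
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

# Score a regression model on five held-out folds
model = LinearRegression()
scores = cross_val_score(model, X, y, cv=5, scoring="r2")

# A simple acceptance gate: on failure, feed back into data prep and training
if scores.mean() < 0.7:
    print("validation failed: revisit data preparation and model choice")
else:
    print(f"validation passed: mean R^2 = {scores.mean():.2f}")
```

The feedback loop is the important part: a failing score sends the pipeline back to steps 2 and 3 rather than forward to deployment.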
5. Deploy pipeline
The models are fed into OPIX tools and systems that embed specialised algorithmic and interaction logic particular to public policy and business strategy consulting. Knowledge and insights are extracted and visualised appropriately to convey the hidden “data messages” for evidence-based decision making. Examples include various metrics and indicators, trend analysis, and others.
6. Produce inference and insights
As new relevant data or cases arise, the models are applied to produce predictions for various measures. These may then be used as additional input data for further modelling.
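A minimal sketch of this feedback loop, with a hypothetical model and data (the indicator being predicted is purely illustrative):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical model trained on historical indicator data
history_X = np.array([[1.0], [2.0], [3.0], [4.0]])
history_y = np.array([2.1, 3.9, 6.2, 8.1])
model = LinearRegression().fit(history_X, history_y)

# As new cases arrive, the trained model produces predictions for them...
new_cases = np.array([[5.0], [6.0]])
predictions = model.predict(new_cases)

# ...and those predictions can become additional input for further modelling
augmented_X = np.vstack([history_X, new_cases])
augmented_y = np.concatenate([history_y, predictions])
```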