
Introducing HOOP Lake: a data-centric approach to cyber security operations. Supercharge your data with Amazon Security Lake.

Cyber security is fundamentally a data problem. HOOP Lake is set to change the way security teams identify and combat emerging threats across all threat vectors.

The HOOP Lake approach, powered by your data and hosted in Amazon Security Lake, improves and simplifies the way cyber security experts search and automate across the whole ecosystem of security events through categorisation, normalisation, and enrichment.

Stream

Simply bring your own data

Our data processor logic automatically receives log information from your data sources and transforms it into your target format, optimised and enriched for storage, search and compliance. We focus on the OCSF and OSSEM standards, but also support others such as CIM. Our streamer is built for extremely high throughput, manipulating data according to how each log source needs to be treated. For example, we can enrich the stream with regulatory or threat intelligence data, truncate keywords, and consolidate duplicate records that differ only in their timestamps.
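
As a rough illustration of this kind of transformation, the Python sketch below normalises a raw log record into a simplified, OCSF-inspired envelope, enriches it from a hypothetical threat-intelligence table, truncates long message fields, and consolidates duplicates that differ only by timestamp. The field names, class mapping and intel feed are assumptions for illustration, not HOOP Lake's actual schema or processor.

```python
import hashlib
import json
from datetime import datetime, timezone

# Hypothetical threat-intelligence table (assumption for illustration only).
THREAT_INTEL = {"203.0.113.10": {"reputation": "malicious", "source": "example-feed"}}

def normalise_event(raw: dict) -> dict:
    """Map a raw log record onto a simplified, OCSF-inspired envelope."""
    src_ip = raw.get("src_ip", "")
    event = {
        "class_name": "Network Activity",  # illustrative OCSF-style class
        "time": raw.get("timestamp", datetime.now(timezone.utc).isoformat()),
        "src_endpoint": {"ip": src_ip},
        "message": raw.get("msg", "")[:256],  # truncate long keywords/messages
        "enrichments": [],
    }
    intel = THREAT_INTEL.get(src_ip)
    if intel:  # enrich the stream with threat intelligence where available
        event["enrichments"].append({"type": "threat_intel", "data": intel})
    # Stable key used to consolidate duplicates that differ only by timestamp.
    event["dedup_key"] = hashlib.sha256(
        json.dumps({"ip": src_ip, "msg": event["message"]}, sort_keys=True).encode()
    ).hexdigest()
    return event

def consolidate(events: list[dict]) -> list[dict]:
    """Keep one record per dedup_key, retaining every observed timestamp."""
    merged: dict[str, dict] = {}
    for ev in events:
        record = merged.setdefault(ev["dedup_key"], {**ev, "observed_times": []})
        record["observed_times"].append(ev["time"])
    return list(merged.values())

raw_logs = [
    {"src_ip": "203.0.113.10", "msg": "connection refused", "timestamp": "2024-01-01T00:00:00Z"},
    {"src_ip": "203.0.113.10", "msg": "connection refused", "timestamp": "2024-01-01T00:00:05Z"},
]
print(consolidate([normalise_event(r) for r in raw_logs]))
```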

Store

Efficient store for your data

Your normalised and enriched data is stored in a compressed and optimised format, allowing for common access and efficient search; the information is held in a high-performance database with automatic compression and decompression. We leverage Parquet tables to provide a high level of compression and high-performance indexing, while our streaming stage has already normalised the data so that it can be stored in the most efficient manner.
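
As a simple illustration of this storage pattern, the sketch below writes a handful of normalised events to a compressed Parquet file with pyarrow and reads back only the columns a search needs. The file name, column names and choice of zstd compression are assumptions for illustration rather than HOOP Lake's actual layout.

```python
import pyarrow as pa
import pyarrow.parquet as pq

events = [
    {"class_name": "Network Activity", "time": "2024-01-01T00:00:00Z",
     "src_ip": "203.0.113.10", "severity": "high"},
    {"class_name": "Authentication", "time": "2024-01-01T00:01:00Z",
     "src_ip": "198.51.100.7", "severity": "low"},
]

# Columnar layout plus codec-level compression keeps the on-disk footprint small.
table = pa.Table.from_pylist(events)
pq.write_table(table, "security_events.parquet", compression="zstd")

# Column pruning on read: only the fields a search needs are decompressed.
subset = pq.read_table("security_events.parquet", columns=["time", "src_ip"])
print(subset.to_pylist())
```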

Enrich

Orchestrate your data flows

Our orchestrator allows you to order and manipulate your ingested data sources based on your unique data set requirements. For example, your search actions may require additional fields to be captured, and our orchestrator will automatically add these to the streaming function; or you may want data enriched to a new regulatory standard, so we simply add additional components to the stream. Alternatively, you may want to enrich data prior to normalisation, or normalise before archiving. The HOOP Orchestrator uses modular blocks, which allow streams to be manipulated without the need to rewrite your streaming code.
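
As a rough sketch of this modular approach, the example below composes a stream from small, swappable stages; reordering the flow or adding an enrichment is just a matter of editing the list of blocks. The stage names and event fields are hypothetical and are not the HOOP Orchestrator's actual API.

```python
from typing import Callable

Stage = Callable[[dict], dict]

def add_regulatory_tag(event: dict) -> dict:
    """Example enrichment block: tag the event for a regulatory standard."""
    event["regulatory"] = ["GDPR"]
    return event

def capture_extra_field(event: dict) -> dict:
    """Example capture block: pull an additional field out of the raw record."""
    event["user_agent"] = event.get("raw", {}).get("ua", "unknown")
    return event

def build_pipeline(stages: list[Stage]) -> Stage:
    """Chain modular blocks into a single stream function."""
    def run(event: dict) -> dict:
        for stage in stages:
            event = stage(event)
        return event
    return run

# Reordering or extending the flow means editing this list, not the stream code.
pipeline = build_pipeline([capture_extra_field, add_regulatory_tag])
print(pipeline({"raw": {"ua": "curl/8.0"}}))
```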

Comply

Compliance metrics at your fingertips

Your data is automatically enriched at the point of ingestion, making on-the-fly dashboard reporting and visualisation simple. Whether you want to report against NIST or MITRE frameworks, our streaming process automatically categorises data based on your needs, so observability is built in as standard. Because we have a highly scalable stream and store process, our compliance dashboards are created in real time using live data, which provides the most accurate view of your estate.
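
As an illustration of this kind of categorisation at ingestion, the sketch below tags events with example MITRE ATT&CK technique and NIST CSF category labels and counts them for a dashboard view. The mapping table is a hypothetical example, not an authoritative framework mapping and not HOOP Lake's actual enrichment logic.

```python
from collections import Counter

# Hypothetical event-type to framework mapping (illustrative only).
FRAMEWORK_MAP = {
    "failed_login": {"mitre_attack": "T1110 Brute Force", "nist_csf": "DE.CM"},
    "malware_detected": {"mitre_attack": "T1204 User Execution", "nist_csf": "DE.AE"},
}

def categorise(event: dict) -> dict:
    """Attach framework tags to an event at the point of ingestion."""
    tags = FRAMEWORK_MAP.get(event.get("event_type", ""), {})
    return {**event, **tags}

stream = [
    {"event_type": "failed_login", "host": "web-01"},
    {"event_type": "failed_login", "host": "web-02"},
    {"event_type": "malware_detected", "host": "db-01"},
]

tagged = [categorise(ev) for ev in stream]
# A dashboard widget might simply count live events per framework category.
print(Counter(ev.get("nist_csf", "unmapped") for ev in tagged))
```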
