Databricks customers previously needed to manually configure S3 bucket integrations and create Databricks notebooks from scratch to ingest and parse data into the Databricks environment. The Databricks notebooks created for this project simplify those operations and reduce the manual tasks customers must perform to configure data collection and parsing. The Splunk app allows customers to run Databricks queries and jobs in the form of Splunk search queries, reducing their dependency on direct access to a Databricks instance.
Databricks Inc. is a Data + AI company based in San Francisco, California. The company provides a unified, collaborative, cloud-based data analytics platform built on a modern lakehouse architecture, which combines the best of data warehouses and data lakes.
Databricks customers previously needed to manually configure S3 bucket integrations and create Databricks notebooks from scratch to ingest and parse AWS CloudTrail, AWS VPC logs, and syslog data into the Databricks environment for further processing and analytics. Customers also had to manually create jobs and queries from the UI in order to run them.
Crest Data Systems wrote Databricks notebooks to collect and parse AWS CloudTrail, AWS VPC logs, and syslog data from S3 buckets into the Databricks environment for further processing. Crest also created Databricks notebooks to push specified data from Databricks into Splunk for ingestion, and to pull data present in Splunk into tables in the Databricks environment. In addition, Crest helped build the Splunk App for Databricks, which allows Splunk admins to run queries against Databricks tables and execute Databricks jobs and notebooks using custom commands from Splunk. The following custom commands were implemented:
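The actual notebook code is not shown in this write-up, but the parsing step can be illustrated with a minimal sketch. CloudTrail delivers log files as JSON documents with a top-level "Records" array; a collection notebook would flatten each record into the columns loaded into a Databricks table (in a real notebook this would typically be done with PySpark; plain Python is used here for illustration, and the selected fields are an assumption):

```python
import json

def parse_cloudtrail(raw: str):
    """Flatten a CloudTrail log file (JSON with a top-level
    "Records" array) into per-event dicts, keeping a few fields
    a notebook might load into a Databricks table."""
    doc = json.loads(raw)
    events = []
    for rec in doc.get("Records", []):
        events.append({
            "event_time": rec.get("eventTime"),
            "event_name": rec.get("eventName"),
            "event_source": rec.get("eventSource"),
            "aws_region": rec.get("awsRegion"),
            "source_ip": rec.get("sourceIPAddress"),
            # userIdentity is a nested object in CloudTrail records
            "user": (rec.get("userIdentity") or {}).get("arn"),
        })
    return events

# Tiny synthetic CloudTrail file for demonstration purposes.
sample = json.dumps({"Records": [{
    "eventTime": "2021-01-01T00:00:00Z",
    "eventName": "ConsoleLogin",
    "eventSource": "signin.amazonaws.com",
    "awsRegion": "us-east-1",
    "sourceIPAddress": "203.0.113.7",
    "userIdentity": {"arn": "arn:aws:iam::123456789012:user/alice"},
}]})
print(parse_cloudtrail(sample))
```

In a notebook, the resulting rows would then be written to a table for further processing and analytics.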
The Databricks notebooks helped customers simplify data collection and parsing and reduce the manual tasks previously required to configure these integrations.
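The custom commands that execute Databricks jobs from Splunk ultimately need to trigger work in the Databricks workspace; a natural mechanism is the Databricks Jobs REST API. The sketch below only builds the request for the Jobs API 2.1 `run-now` endpoint rather than sending it; the workspace URL, token, and parameter names are hypothetical, and the source does not confirm this exact implementation:

```python
import json

DATABRICKS_HOST = "https://example.cloud.databricks.com"  # hypothetical workspace URL

def build_run_now_request(token: str, job_id: int, notebook_params: dict):
    """Construct the URL, headers, and JSON body for a Databricks
    Jobs `run-now` call (Jobs API 2.1). Performing the HTTP POST
    is left to the caller (e.g. the Splunk custom command)."""
    url = f"{DATABRICKS_HOST}/api/2.1/jobs/run-now"
    headers = {
        "Authorization": f"Bearer {token}",  # personal access token
        "Content-Type": "application/json",
    }
    body = json.dumps({"job_id": job_id, "notebook_params": notebook_params})
    return url, headers, body

# Example: a custom command triggering job 42 with a parameter.
url, headers, body = build_run_now_request("dapiXXXX", 42, {"source": "splunk"})
print(url)
```

Separating request construction from transport keeps the command logic easy to test without a live workspace.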