LOGIQ.AI advances machine data management with DPaaS offering

LOGIQ.AI has launched LogFlow, an observability data pipeline as a service (DPaaS). LogFlow operates in the machine data management space and is designed to enable businesses to make use of machine data by connecting it to business teams on demand.

Greg O’Reilly, Observability Consultant at Visibility Platforms, says: “LogFlow enables our customers to take a whole new approach to observability data; an approach that helps them regain control and removes vendor and cost limitations.

“We are opening discussions between ITOps and security teams for the first time with a unified solution that keeps data secure, compliant, manageable and easily accessible to those who need it on the front lines.”

LOGIQ.AI CEO and Co-Founder Ranjan Parthasarathy said, “Unfortunately, businesses have been sold ‘block and drop’ as a smart feature to counter data pipeline backpressure and upstream downtime.

“Block and drop is data loss in disguise. Imagine losing a vital signature in your log feed that indicates impending ransomware is starting to spread. Don’t introduce new business risks with block and drop.”

According to the company, LogFlow eliminates blocking and deletion by storing all streaming data in InstaStore, a storage innovation that enables object storage as primary storage.

In InstaStore, the data is fully indexed and searchable in real time. LogFlow also stores its indexes in InstaStore, providing a scalable platform with fully decoupled storage and compute.

LogFlow ingests data even when upstream targets are down. Due to its indexing capabilities, it provides fine-grained data rebroadcasts.
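InstaStore’s internals are not public, but the idea of fine-grained replay from a fully indexed store can be sketched generically. The class below is purely illustrative, assuming a simple sorted time index over ingested events rather than LogFlow’s actual design:

```python
# Illustrative sketch only: a time-indexed event store that supports
# fine-grained replay of a chosen window. Not LogFlow's actual API.
import bisect


class IndexedStore:
    def __init__(self):
        self.timestamps = []  # kept sorted as events are ingested
        self.events = []

    def ingest(self, ts, event):
        """Insert an event so the timestamp index stays sorted."""
        i = bisect.bisect(self.timestamps, ts)
        self.timestamps.insert(i, ts)
        self.events.insert(i, event)

    def replay(self, start, end):
        """Rebroadcast only the events with start <= ts < end."""
        lo = bisect.bisect_left(self.timestamps, start)
        hi = bisect.bisect_left(self.timestamps, end)
        return self.events[lo:hi]
```

Because replay is driven by the index rather than by scanning raw archives, only the requested slice of data is re-emitted to the recovered upstream target.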

Jay Swamidass, APAC and EMEA Sales Manager for LOGIQ.AI, says, “InstaStore introduces a new paradigm for data agility that eliminates data loss and the need for storage tiering and data rehydration.

“Businesses can now unlock productivity, cost reduction and compliance like never before.”

Additionally, LogFlow’s native support for open standards simplifies the collection of machine data from any source. Similar to network flows, LogFlow manages data with its routing table at the flow level.
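The routing-table analogy can be made concrete with a short sketch. The function and table below are hypothetical, assuming prefix-matched source labels in the spirit of a network routing table; they do not reflect LogFlow’s real configuration model:

```python
# Illustrative sketch only: flow-level routing of machine-data events,
# analogous to a network routing table mapping prefixes to next hops.

def route(routing_table, event):
    """Return the destination for an event; first matching prefix wins."""
    for prefix, destination in routing_table:
        if event["source"].startswith(prefix):
            return destination
    return "default-sink"


# More-specific prefixes are listed first, as in a routing table.
table = [
    ("app/payments", "security-team"),
    ("app/", "itops-team"),
    ("infra/", "itops-team"),
]
```

With a table like this, payment-service logs flow to the security team while the rest of the application and infrastructure data flows to ITOps, all from one pipeline.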

LOGIQ.AI Co-Founder Tito George says, “LogFlow filters out unwanted data and detects security events in flight. Users can route streams, control EPS, and perform fine-grained data replays.

“InstaStore’s indexing and columnar data layout enable faster queries than archive formats such as gzip.”

Open source tools such as Fluent Bit and Logstash can already route data between various sources and target systems and enable the routing of raw archives to object stores.
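As a point of comparison, routing raw logs to an object store with Fluent Bit takes only a small configuration. The paths, bucket name, and region below are placeholders, not values from the article:

```ini
[INPUT]
    Name   tail
    Path   /var/log/app/*.log
    Tag    app.*

[OUTPUT]
    Name               s3
    Match              app.*
    bucket             my-log-archive
    region             us-east-1
    total_file_size    50M
    use_put_object     On
```

This covers simple source-to-sink routing; the harder pipeline problems listed below are what it does not address.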

The complex issues to solve are controlling data volume and sprawl, preventing data loss, enabling data reuse with precise control, and ensuring business continuity during upstream outages.

Theodore Caroll, LOGIQ.AI Americas Sales Manager, says, “There is no technical reason to accept less than 100% data availability.

“Your data is your only real fortress in responding to threats and adverse business events. Businesses need a system like LogFlow that ensures complete data replay is available continuously and indefinitely.”

LogFlow’s built-in “rule sets” contain more than 2,000 rules that filter, tag, extract, and rewrite data for the most common customer environments and workloads. They also allow detection and tagging of security events.
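To illustrate what such a rule set does, the sketch below applies filter, rewrite, and tag actions to log lines in flight. The rules, field names, and tag strings are invented for illustration and are not LogFlow’s actual rule format:

```python
# Illustrative sketch only: a tiny "rule set" that filters, tags, and
# rewrites log events in flight. Rules and tags are hypothetical.
import re

RULES = [
    # Rewrite: redact credentials before data leaves the pipeline.
    {"match": re.compile(r"password=\S+"), "action": "rewrite",
     "replacement": "password=<redacted>"},
    # Tag: flag likely security events for downstream teams.
    {"match": re.compile(r"failed login", re.I), "action": "tag",
     "tag": "security:auth-failure"},
    # Filter: drop noisy, low-value events entirely.
    {"match": re.compile(r"healthcheck"), "action": "drop"},
]


def apply_rules(event):
    """Return (event, tags) after applying rules, or None if dropped."""
    tags = []
    for rule in RULES:
        if rule["match"].search(event):
            if rule["action"] == "drop":
                return None
            if rule["action"] == "rewrite":
                event = rule["match"].sub(rule["replacement"], event)
            if rule["action"] == "tag":
                tags.append(rule["tag"])
    return event, tags
```

Running every event through a chain of rules like this is how a pipeline can simultaneously cut volume, enforce compliance, and surface security signals.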

Overall, the company says, LOGIQ.AI’s LogFlow brings full control over observability data pipelines and delivers high-value, high-quality data to the teams who need it in real time, all the time. Organizations can fully control the collection, consolidation, retention, manipulation and management of upstream data flows.
