Case Studies
AI / MACHINE LEARNING WITH A PURPOSE

Task-Focused AI

Lemuridae Labs was challenged to leverage modern AI technologies to aid in the discovery and analysis of a wide range of information. Large Language Models (LLMs) are powerful tools, but they must be focused and targeted to avoid wasted spend, prevent inaccurate output, and deliver a quality product to users.

In this solution, the team applied Retrieval Augmented Generation (RAG) and a data refinement pipeline that splits source material into chunks and encodes each chunk as a vector embedding usable by the AI models. When joined with other metadata, the RAG process can be powerful, producing accurate and engaging information summaries.
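The core retrieval step of a RAG pipeline can be illustrated with a minimal sketch. This is not the production system described above: the chunking parameters are arbitrary, and a toy bag-of-words embedding stands in for the real embedding model so the example stays self-contained.

```python
import math
import re
from collections import Counter

def chunk(text, size=10):
    """Split source text into overlapping word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size // 2)]

def embed(text):
    """Toy bag-of-words 'embedding'; a real pipeline would call an embedding model."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, chunks, k=2):
    """Rank stored chunks by similarity to the query -- the 'R' in RAG.

    The top-k chunks would then be passed to the LLM as grounding context.
    """
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

docs = chunk("The quarterly report covers revenue growth and churn. "
             "Churn was driven by pricing changes in the enterprise tier. "
             "Revenue grew eight percent quarter over quarter.")
top = retrieve("what drove churn", docs, k=1)
```

Grounding the model's answer in the retrieved chunks, rather than the model's parametric memory alone, is what keeps the generated summaries tied to the source data.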

Although the project was internally focused, the final solution was a data summarization tool that leveraged a range of historical and current information sources to ensure accurate data for organizational consumers.

REAL-TIME WITH REAL-RESULTS

Internet of Things (IoT) Stream Processing

Our team worked with a customer whose distributed workforce collected site data through both manual and automated means, and who needed that information processed and aggregated. The data flow rate was highly variable, and the solution had to support various device types, versions, and capabilities across the fleet of connected devices. The devices, whether connected through cellular backhaul or local networks, needed to publish data, receive updates, and go offline to conserve battery life.

Lemuridae Labs incorporated the data into a stream processing solution, transforming, normalizing, and evaluating each reading before loading it into the data repository. The fusion of sensor data, geospatial data, and device metadata provided the information needed to drive decision making and other overarching support activities.
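The transform/normalize/evaluate stages can be sketched as a simple generator pipeline. The field names, unit conventions, and plausibility bounds below are hypothetical stand-ins; a real fleet needs per-device-version schemas rather than this single mapping.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Reading:
    device_id: str
    metric: str
    value: float
    unit: str
    ts: datetime

def normalize(raw):
    """Normalize a heterogeneous device payload into a canonical Reading.

    Assumes a 'value'/'unit'/'device'/'ts' payload shape for illustration.
    """
    value, unit = raw["value"], raw.get("unit", "c")
    if unit == "f":  # convert Fahrenheit to Celsius
        value, unit = (value - 32) * 5 / 9, "c"
    return Reading(raw["device"], raw.get("metric", "temp"), value, unit,
                   datetime.fromtimestamp(raw["ts"], tz=timezone.utc))

def evaluate(reading, lo=-40.0, hi=85.0):
    """Reject physically implausible values before they reach the repository."""
    return lo <= reading.value <= hi

def process(stream):
    """Transform -> normalize -> evaluate, yielding only valid readings."""
    for raw in stream:
        r = normalize(raw)
        if evaluate(r):
            yield r

raw_events = [
    {"device": "site-7-a", "value": 71.6, "unit": "f", "ts": 1700000000},
    {"device": "site-7-b", "value": 22.0, "unit": "c", "ts": 1700000030},
    {"device": "site-7-c", "value": 999.0, "unit": "c", "ts": 1700000060},  # implausible
]
clean = list(process(raw_events))
```

Keeping each stage a small pure function makes the pipeline easy to extend as new device types and firmware versions join the fleet.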

In this effort, data security, integrity, and provenance were critical: rogue, spoofed, or replayed data had to be detected and addressed before it could impact the system. Any risk to data quality would undermine trust in the system overall and invalidate the effort. Tracking, validating, and managing data throughout the aggregation process was essential to meeting both the functional and data-integrity requirements.
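One common pattern for detecting spoofed and replayed messages is an HMAC signature check combined with a per-device monotonic sequence number. The sketch below illustrates the idea with a single shared key; the actual defenses used in the engagement are not described in the source, and a production system would use per-device keys from a secrets store.

```python
import hmac
import hashlib

SECRET = b"shared-signing-key"  # hypothetical; real systems use per-device keys

def sign(device_id, seq, payload):
    """Compute the HMAC-SHA256 signature a legitimate device would attach."""
    msg = f"{device_id}|{seq}|{payload}".encode()
    return hmac.new(SECRET, msg, hashlib.sha256).hexdigest()

class IntegrityGate:
    """Reject spoofed (bad signature) and replayed (stale sequence) messages."""

    def __init__(self):
        self.last_seq = {}  # highest sequence number accepted per device

    def accept(self, device_id, seq, payload, signature):
        expected = sign(device_id, seq, payload)
        if not hmac.compare_digest(expected, signature):
            return False  # spoofed or corrupted in transit
        if seq <= self.last_seq.get(device_id, -1):
            return False  # replayed or out-of-order
        self.last_seq[device_id] = seq
        return True

gate = IntegrityGate()
ok = gate.accept("dev-1", 1, "t=22.0", sign("dev-1", 1, "t=22.0"))
replay = gate.accept("dev-1", 1, "t=22.0", sign("dev-1", 1, "t=22.0"))
spoof = gate.accept("dev-1", 2, "t=99.9", "deadbeef")
```

Note the use of `hmac.compare_digest` rather than `==`, which avoids leaking signature information through comparison timing.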

Building Success,
One Project at a Time.
Today is the day we can build something together, expanding and collaborating to create something new.
Start Now