Artificial Intelligence For Event Prediction and Anomaly Detection
We’ve built an artificial intelligence that learns. It builds a complex, multivariate view of the world from the data it ingests, and it performs temporal reasoning and on-the-fly learning, with no models to train and deploy and no rules to create.
The more data it observes, the smarter it gets, and because there is nothing to train and nothing to hand-code, it is far faster and more cost-effective to deploy than other predictive analytics solutions. It runs on devices of all sizes, from tiny Arduinos the size of a quarter to massive clusters of high-performance computers. This helps our customers increase efficiency and reduce costs by pushing their predictive analytics to the edges of their networks.
Fast and Cost-Effective Deployment
- Our AI does all the hard work for you. No need to hire data scientists to create, deploy, and maintain predictive models. Deploy our AI once, and you’re done.
- Our AI starts learning as soon as it starts observing your data, and it can learn from historical data if you have it.
- No need for subject matter experts to create complex rules to monitor your systems; our AI learns what normal behavior is, including complex and cyclic patterns.
- If you do have an existing rules-based system, our AI can ingest those, too, so that you can include alerting based on existing rules, as well as anomalous or predictive behavior observed by our AI.
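To make the "learns what normal behavior is" idea concrete, here is a minimal sketch of one common streaming approach: an online mean/variance estimate (Welford's algorithm) with a z-score threshold. This illustrates the general technique under our own assumptions; it is not Simularity's actual algorithm, and the class name and parameters are hypothetical.

```python
# Illustrative sketch only: online baseline learning via Welford's
# algorithm, flagging readings that deviate strongly from the
# behavior observed so far. Not Simularity's actual AI.
import math

class OnlineAnomalyDetector:
    """Learns a running baseline from the stream itself; no pre-trained model."""

    def __init__(self, z_threshold=4.0, warmup=30):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0          # running sum of squared deviations
        self.z_threshold = z_threshold
        self.warmup = warmup   # observations required before alerting

    def observe(self, x):
        """Ingest one reading; return True if it looks anomalous."""
        anomalous = False
        if self.n >= self.warmup:
            std = math.sqrt(self.m2 / (self.n - 1))
            if std > 0 and abs(x - self.mean) / std > self.z_threshold:
                anomalous = True
        # Welford's update: keep learning from every reading.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomalous
```

A detector like this keeps only three numbers of state per signal, which is one reason this family of techniques fits on microcontroller-class hardware.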
AI at the edges of the IoT
- On the tiniest devices (such as Arduinos), our AI does anomaly detection by learning on the fly from the sensor data it sees. It can raise alerts and send its data to a larger computer for further analysis.
- On small computers, our AI creates a multivariate knowledge base composed of data from nearly any source, and can do event prediction, anomaly detection, and cooperative learning with other devices.
- On larger servers and clusters, our AI ingests massive amounts of data to create a complex view of the world for the most accurate predictions, pattern detection, multivariate clustering and classification, and dynamic relationship discovery.
- Rather than sending all your sensor data to a central repository, you have the option of sending compressed summaries of the normal sensor readings, saving power, bandwidth, and infrastructure costs.
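As an illustration of what a "compressed summary" of a window of normal readings might contain, the sketch below collapses a batch of raw values into a handful of statistics. The function name and fields are assumptions made for this example, not Simularity's actual wire format.

```python
# Illustrative sketch only: instead of shipping every raw reading
# upstream, an edge device batches a window of readings and sends a
# few summary statistics. Field names here are hypothetical.

def summarize_window(readings):
    """Collapse a window of raw sensor readings into a compact summary."""
    n = len(readings)
    return {
        "count": n,
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / n,
    }
```

Sending four numbers instead of, say, 256 raw readings reduces bandwidth by roughly two orders of magnitude, which is where the power and infrastructure savings come from.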
Flexible, Scalable, Smart, Real Time Predictions
- Flexible: Our AI is data agnostic, which means it functions well in nearly any vertical or industry, whether it be healthcare data, network data, financial data, or sensor data.
- Scalable: Built from the ground up to be scalable and efficient, our AI supports trillions of data points across cooperating clusters of computers that can range in geography, computing power, OS, and manufacturer.
- Smart: Our AI reduces false alarms, preventing alarm fatigue and saving time spent troubleshooting things that aren’t actually problems, and alerts you to future problems in time to prevent disasters.
- Real time: Our AI supports data ingestion of millions of data points per second, and millisecond analysis that includes all the data, both streaming and historical.
Simularity pitches at the Cisco Grand Innovation Challenge at IoT World Forum in Dubai, December 2015.
Simularity’s CEO, Liz Derr, spoke at Caffeet 2015 – here are some takeaways and a peek at our AI detecting live anomalies.
Simularity is now a General Member of the Intel® Internet of Things Solutions Alliance and will be attending the Intel Developers Forum in San Francisco, August 18-20.
Meet Ray Richardson, our Chief Technology Officer. Ray will speak on “Practical Predictive Analytics on Time Series Data using SAX” at the Machine Learning Conference in Seattle May 1, 2015.
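For readers unfamiliar with SAX (Symbolic Aggregate approXimation), the time-series technique named in the talk title, here is a minimal sketch: the series is z-normalized, averaged into equal segments (PAA), and each segment mean becomes a letter chosen by standard-normal breakpoints. The parameters below are illustrative and are not taken from the talk itself.

```python
# Minimal SAX sketch: z-normalize, piecewise aggregate approximation,
# then map each segment average to a symbol. Illustrative parameters.
import math

# Breakpoints for an alphabet of size 4 (quartiles of N(0, 1)).
BREAKPOINTS = [-0.6745, 0.0, 0.6745]
ALPHABET = "abcd"

def sax(series, n_segments):
    """Convert a numeric series to a SAX word of n_segments letters."""
    n = len(series)
    mean = sum(series) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in series) / n)
    z = [(x - mean) / std for x in series]  # z-normalize
    seg_len = n // n_segments               # assumes n divisible by n_segments
    word = []
    for i in range(n_segments):
        seg = z[i * seg_len:(i + 1) * seg_len]
        avg = sum(seg) / len(seg)           # PAA segment average
        idx = sum(1 for b in BREAKPOINTS if avg > b)
        word.append(ALPHABET[idx])
    return "".join(word)
```

The resulting short "words" let time series be compared, indexed, and mined with string algorithms rather than raw numeric distance computations.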
Gabriel Sidhom, Vice President of Technology Development at Orange Silicon Valley, will present a case study using Simularity’s predictive analytics at the Telco Big Data Summit in Las Vegas, September 10, 2014.