50,000 Websites and 10,000 Hours:
The Story of Skim AI

Two numbers drove the mission of Skim AI: 50,000 websites are published every day with new information to mine, and after 10,000 hours of practice, I became an expert at mining them. Skim AI's goal? To help researchers make informed decisions, faster.

I spent more than six years reading articles from hundreds of publications to detect shifting behavioral and geopolitical trends across various industries and markets. During that time, I became fascinated by the idea that valuable data was hidden inside lengthy but well-researched news and magazine articles, and by the question of how to identify it quickly. They say it takes 10,000 hours to become an expert at something, so after a few years as a macroeconomic advisor, I had become an expert at detecting and extracting valuable data points from long articles, fast.

How? I noticed patterns in the way sentences containing valuable data points were worded and structured, and that hunch led us to build a data-extraction algorithm that works on the news.

We are living in the age of information overload, which leads to decision paralysis, which in turn leads to outsourcing decision making to “experts” who also have time constraints and unknown motivations. Outsourcing decisions because of inadequate research tools is something that has always irked me. The way to improve decision making and hold “experts” and policy makers accountable is to build tools that can instantly extract data on any subject from written text. Signal searchers, if you will. That’s what we have been on a mission to provide since Skim AI was incorporated in 2017.

Every day, 50,000 new websites are published that contain news and other data.

I dare to generalize that anyone researching anything has a similar motive: to find the most impactful information and to be the first among their competitors to do so. That means you need to know where to look, how to look, and what to look for. But even after 10,000 hours of research, it’s impossible for an expert to keep up with the amount of data being pushed out every day. I wanted to build something to help individuals process “internet-scale” amounts of information, find what they need, and make better decisions so they can grasp tomorrow’s potential, today.

Over a span of 18 months, my business partner and I built and trained an algorithm based on verbal patterns to detect facts in written text. The goal was to create a tool that could scrape the massive amounts of information published daily on the internet and help knowledge workers find what they needed faster.

Today, Skim AI is democratizing access to machine learning and natural language processing tools so that you can uncover the trends of tomorrow right now. The machine learning aspect means our future-of-work tools will continuously improve and expand based on the needs of our users, and everything we do is in support of their research efforts. Whether you’re an MBA student preparing for a case or an investment banker tracking prospects, Skim AI was built to be your trusted research assistant.

See why hundreds of people are using Skim AI to help them make better decisions, faster.
