This topic gives an overview of how we collect data from data breaches.
We gather this information from many sources, collecting breached and leaked data sets through multiple techniques. Our primary approach combines human intelligence with scanning and monitoring technologies that can access forums on the deep and dark web.
We collect a massive amount of data every day. Once collected, the data goes through a rigorous quality-control process to determine its value: we sort it by file type and by the kind of data it contains, and we discard data that is not useful. Because a data set may contain records we already acquired from a previous breach, the data also passes through a deduplication step to eliminate these duplicates.
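The deduplication step can be sketched as hash-based fingerprinting: each record is normalized, hashed, and dropped if its fingerprint has been seen before. This is a minimal illustrative sketch, assuming records are simple key-value pairs; the field names and normalization rules shown here are assumptions, not a description of the actual pipeline.

```python
import hashlib

def normalize(record: dict) -> str:
    # Canonical form: sorted keys, trimmed and lowercased values,
    # so field order and trivial formatting differences don't matter.
    return "|".join(f"{k}={str(v).strip().lower()}" for k, v in sorted(record.items()))

def deduplicate(records, seen_hashes=None):
    """Yield only records whose normalized fingerprint has not been seen.

    seen_hashes can be pre-populated with fingerprints from previously
    processed breaches, so cross-breach duplicates are dropped as well.
    """
    seen = seen_hashes if seen_hashes is not None else set()
    for record in records:
        fingerprint = hashlib.sha256(normalize(record).encode("utf-8")).hexdigest()
        if fingerprint not in seen:
            seen.add(fingerprint)
            yield record

# Hypothetical batch: the second record is a duplicate of the first
# once casing and whitespace are normalized.
batch = [
    {"email": "alice@example.com", "password_hash": "abc123"},
    {"email": "ALICE@example.com ", "password_hash": "abc123"},
    {"email": "bob@example.com", "password_hash": "def456"},
]
unique = list(deduplicate(batch))
```

Keeping the fingerprint set across batches lets the same check catch duplicates between a new leak and data acquired earlier.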
Once collection and quality control are complete, we verify the authenticity of the data. In the final phase, we compile the analysis of the data into the report.