Discovery
UpGuard can now report that it has secured an Elasticsearch database for AngelSense, a GPS tracker and assistive technology for children and adults with special needs. The database contained 30 indexes of log files, each of which contained around 120 million entries. Those log files contained data generated during the sale and operation of AngelSense systems, including shipping addresses, contact information, partial credit card numbers, passwords, GPS coordinates, and metadata related to the AI decision engine for determining the status of the people being monitored.
UpGuard analysts detected the potential data exposure on January 17, 2025. On the same day, UpGuard emailed AngelSense. On January 24, when the database was still accessible, UpGuard called AngelSense’s US phone number and spoke with a support person who said they had received the email and it had been escalated for review. On the morning of January 27, UpGuard confirmed the database was no longer accessible.
Exposed data
The data collection was organized into 53 different indexes, 30 of which were sizable collections of log files. In total, those indexes contained over 3 billion objects with a storage size of 1.8 terabytes. Downloading the complete data set was not feasible, but using the Elasticsearch query system, UpGuard analysts were able to identify and download samples of data illustrating the different kinds of exposure.
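The sampling approach described above can be sketched with Elasticsearch's standard REST `_search` and `_count` endpoints. The host and index names below are hypothetical illustrations, not the actual exposed system:

```python
import json

# Hypothetical sketch: how an analyst might sample documents from an
# openly readable Elasticsearch index without downloading it in full.
# Host and index names here are illustrative assumptions.

def build_sample_request(host: str, index: str, size: int = 10) -> tuple:
    """Return the URL and JSON body for a small _search sample.

    Sorting on _doc walks the index in storage order, which is the
    cheapest option when relevance ranking does not matter.
    """
    url = f"{host}/{index}/_search"
    body = {
        "size": size,          # fetch only a handful of documents
        "sort": ["_doc"],      # index order; fastest for sampling
        "query": {"match_all": {}},
    }
    return url, json.dumps(body)

def count_request(host: str, index: str) -> str:
    """URL for the _count endpoint, used to size an index before sampling."""
    return f"{host}/{index}/_count"

url, body = build_sample_request(
    "http://example-host:9200", "production_us_logs", size=5
)
# The actual HTTP call would then be something like:
#   requests.get(url, data=body,
#                headers={"Content-Type": "application/json"})
```

Paging a few documents at a time this way is how samples can be drawn from an index holding billions of objects without transferring the full 1.8 terabytes.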
Each object was a log message from an AngelSense system. Logs can contain all kinds of information. Many logs are not particularly interesting in isolation, documenting system actions that require context to be meaningful. In this case, however, there were also log files from the “Production_US” system that contained personal information for the buyers of the AngelSense offering and the people whom they wanted to protect.
Purchase details
One type of data found in the logs documented purchases of the AngelSense hardware GPS tracker and/or the software subscription service. In some, but not all, cases the data included the buyer’s name, shipping address, email address, phone number, credit card expiration date, last four digits of the credit card number, and whether a sponsoring organization was involved.


User passwords
Some logs included user account information, including the name of the user, their email address, their password, and the “angel” inviting them to the platform. The passwords were unique across users and satisfied password complexity standards for length and character mix, suggesting they were intended for real use.


End user descriptions and location data
The AngelSense system allows guardians to track the location of people with special needs and communicate with them if they are in danger. Some of the logs capture the GPS coordinates of the users being tracked, along with movement data used to infer their location, such as speed, heading, and steps per minute. The account data for the devices includes details about the person being monitored, including their year of birth and a list of their special needs, such as autism, dementia, or Down syndrome.


AI decision-making
AngelSense operates by determining whether a monitored person is in peril, “utilizing AI to overcome the inherent limitations of GPS.” Log files recorded the AI decisions, summarizing changes in geolocation data into an ostensible narrative of that person’s activity.

Partner organizations
AngelSense partners with, and is distributed by, U.S. police departments and other community organizations that provide care for people with special needs. The partner organizations were documented through the account information, which sometimes included the name of the organization and the SSIDs of their wireless networks.

Analysis
Using technology to make vulnerable people safer is a laudable mission, but handling the data for such a project creates an equal responsibility for care. Personally identifiable information, payment information, and user passwords are well-recognized classes of data that need to be kept private. One way to reduce the risk of handling these types of data is to ensure they are stored only in a database, not sent to logging systems in plaintext.
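One common way to keep such data out of logging pipelines is to scrub sensitive fields before a log record is emitted. A minimal sketch follows; the field names are assumptions for illustration, not AngelSense’s actual schema:

```python
# Fields that should never reach a logging pipeline in plaintext.
# These key names are illustrative assumptions, not a real schema.
SENSITIVE_KEYS = {"password", "card_number", "card_expiry", "email", "phone"}

def redact(record: dict) -> dict:
    """Return a copy of a log record with sensitive values masked.

    Recurses into nested dicts and lists so sensitive fields are
    caught wherever they appear in the structure.
    """
    out = {}
    for key, value in record.items():
        if key.lower() in SENSITIVE_KEYS:
            out[key] = "[REDACTED]"
        elif isinstance(value, dict):
            out[key] = redact(value)
        elif isinstance(value, list):
            out[key] = [redact(v) if isinstance(v, dict) else v for v in value]
        else:
            out[key] = value
    return out

event = {
    "action": "purchase",
    "buyer": {"name": "J. Doe", "email": "jdoe@example.com"},
    "payment": {"card_number": "4111111111111111", "card_expiry": "01/27"},
}
safe = redact(event)
# safe["buyer"]["email"] is now "[REDACTED]"; "action" passes through.
```

Running every record through a filter like this at the application boundary means that even if the log store is later misconfigured and exposed, the most damaging fields are already masked.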
Perhaps more disconcerting is the exposure of the location data for the individuals being monitored. The significance of exposing this data can be gauged by looking at recent efforts to ensure its privacy, like in the California Consumer Privacy Act or the stalled bipartisan Geolocation Privacy and Surveillance Act, or the risk it poses when obtained illicitly, like in the Gravy Analytics breach.
The exposed data also illustrates how AI creates the risk of exposing an additional layer of insights on top of the raw data collected. In theory, AI is not adding any new data here, but it is creating information by using that data to better estimate the user’s location and their status. Turning a set of latitude and longitude coordinates into the story of a person's life is the promise of what AI can do for tracking applications like this. It also means that the exposure of AI-inferred decisions can reveal more about the people they describe than the raw data alone.
AngelSense helps protect some of society’s most vulnerable people, but this case also illustrates the need to determine what level of security practices should be required of such a service. Their Trust and Security page says they encrypt data in transit and at rest, which are good practices; but AngelSense’s Privacy and Security page does not mention auditing against holistic security frameworks like ISO 27001 or SOC 2. While the hardware costs of creating GPS tracking devices have become quite low—similar products can be found on Amazon for as little as $10—such systems still require investment in the security and privacy practices needed to safely handle the data they produce.