Entity resolution incorporating data from various data sources which uses tokens and normalizes records
Abstract:
A pair of records is tokenized to form a normalized representation of an entity represented by each record. The tokens are correlated to a machine learning system by determining whether a learned resolution already exists for the two entities. If not, the normalized records are compared to generate a comparison measure to determine whether the records match. The normalized records can also be used to perform a web search and web search results can be normalized and used as additional records for matching. When a match is found, the records are updated to indicate that they match, and the match is provided to the machine learning system to update the learned resolutions.
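The following Python sketch illustrates, under stated assumptions, the flow the abstract describes: tokenize and normalize a pair of records, check the learning system for an existing learned resolution, otherwise compute a comparison measure, and feed the decision back. All names (tokenize_and_normalize, LearnedResolutions, resolve), the Jaccard similarity, and the 0.8 threshold are hypothetical; the patent does not specify an implementation, data model, or metric, and the web-search augmentation step is omitted here.

```python
# Hypothetical sketch of the entity-resolution flow described in the abstract.
# All class/function names, the similarity metric, and the threshold are
# illustrative assumptions, not the patented implementation.

import re


def tokenize_and_normalize(record: dict) -> tuple:
    """Tokenize a record's field values and return a normalized token tuple."""
    tokens = []
    for value in record.values():
        tokens.extend(re.findall(r"[a-z0-9]+", str(value).lower()))
    return tuple(sorted(set(tokens)))


class LearnedResolutions:
    """Stand-in for the machine learning system that stores learned resolutions."""

    def __init__(self):
        # Maps an unordered pair of normalized representations to a match decision.
        self._resolutions = {}

    def lookup(self, norm_a, norm_b):
        return self._resolutions.get(frozenset((norm_a, norm_b)))

    def learn(self, norm_a, norm_b, matched: bool):
        self._resolutions[frozenset((norm_a, norm_b))] = matched


def compare(norm_a, norm_b) -> float:
    """Comparison measure over normalized tokens (Jaccard similarity, as one choice)."""
    a, b = set(norm_a), set(norm_b)
    return len(a & b) / len(a | b) if (a or b) else 0.0


def resolve(record_a: dict, record_b: dict, model: LearnedResolutions,
            threshold: float = 0.8) -> bool:
    """Decide whether two records refer to the same entity."""
    norm_a = tokenize_and_normalize(record_a)
    norm_b = tokenize_and_normalize(record_b)

    # Check whether a learned resolution already exists for this pair.
    learned = model.lookup(norm_a, norm_b)
    if learned is not None:
        return learned

    # Otherwise compare the normalized records to generate a comparison measure.
    matched = compare(norm_a, norm_b) >= threshold

    # Provide the decision back so the learned resolutions are updated.
    model.learn(norm_a, norm_b, matched)
    return matched
```

In this sketch, Jaccard similarity over normalized tokens stands in for the unspecified comparison measure; the same resolve() path could be applied to normalized records built from web search results, which the abstract treats as additional candidate records for matching.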