You could push the list of hashes into a custom table using the Log Analytics ingestion API (
https://docs.microsoft.com/en-us/rest/api/loganalytics/create-request) or via a Logic App (
https://docs.microsoft.com/en-us/connectors/azureloganalyticsdatacollector/). If that data changes often I'm not sure how practical that will be, and you will also pay ingestion costs each time you push it in. That said, if you do decide to go that route, once the data is ingested it will definitely be the quickest option to query.
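Just as a rough sketch of what the lookup would look like once the data is in the workspace - the table and column names here (FileHashAllowlist_CL / SHA256_s) are only placeholders, and I'm using DeviceFileEvents as the example source, swap in whatever table actually holds your hashes:

```kusto
// Hypothetical custom table FileHashAllowlist_CL populated via the ingestion API / Logic App.
// Flag any file events whose hash is NOT in the ingested known-good list.
let KnownGood = FileHashAllowlist_CL
    | project SHA256 = tolower(SHA256_s);
DeviceFileEvents
| where isnotempty(SHA256)
| where tolower(SHA256) !in (KnownGood)
| project Timestamp, DeviceName, FileName, FolderPath, SHA256
```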
You can also query external data in KQL without ingesting it, e.g. if you had a CSV file sitting in Azure Blob Storage you could query it directly, see an example here -
https://techcommunity.microsoft.com/t5/azure-sentinel/using-external-data-sources-to-enrich-network-.... I'm not sure how practical that is going to be across multiple large files, but it's worth a shot to see how it performs.
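The operator behind that approach is externaldata. A minimal sketch, assuming a single-column CSV of known-good hashes in a blob container (storage URL and SAS token are placeholders, and again DeviceFileEvents is just an example source):

```kusto
// Read the known-good hash list straight from blob storage - no ingestion, no ingestion cost.
let KnownGoodHashes = externaldata(SHA256: string)
[
    h@"https://<storageaccount>.blob.core.windows.net/<container>/knowngood.csv?<SAS-token>"
]
with (format="csv", ignoreFirstRecord=true);
DeviceFileEvents
| where isnotempty(SHA256)
| where SHA256 !in (KnownGoodHashes)
```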
Have you thought about using threat intelligence to hunt for known-bad hashes / domains / IP addresses etc. instead of maintaining a list of all known-good ones? Another option for you perhaps -
https://docs.microsoft.com/en-us/azure/sentinel/understand-threat-intelligence
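If you go down that path, the indicators you import land in the ThreatIntelligenceIndicator table and can be matched against your events. A hedged sketch below - the 14 day lookback is arbitrary, and DeviceFileEvents / SHA256 are only examples, match against whichever log and field actually hold your hashes:

```kusto
// Match active, unexpired file-hash indicators from imported TI against file events.
let BadHashes = ThreatIntelligenceIndicator
    | where TimeGenerated > ago(14d)
    | where Active == true and ExpirationDateTime > now()
    | where isnotempty(FileHashValue)
    | project FileHashValue = tolower(FileHashValue);
DeviceFileEvents
| where isnotempty(SHA256)
| where tolower(SHA256) in (BadHashes)
| project Timestamp, DeviceName, FileName, FolderPath, SHA256
```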