Is anyone currently using the Cloudflare Data Connector?


I am trying to connect the Cloudflare data connector to my Sentinel instance and am not having any luck at all.

 

I pulled it into our test environment from the Hub, and I am able to get the Cloudflare logs down to our storage account. I have tried reading logs both from a daily folder inside the container and with the blobs dropped directly into the container, with no luck either way. I am consistently getting failures in App Insights for the function.

 

Exception while executing function: Functions.AzureFunctionCloudflare Result: Failure
Exception: ValueError: Connection string missing required connection details.
Stack: File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/dispatcher.py", line 398, in _handle__invocation_request
call_result = await self._run_async_func(
File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/dispatcher.py", line 617, in _run_async_func
return await ExtensionManager.get_async_invocation_wrapper(
File "/azure-functions-host/workers/python/3.8/LINUX/X64/azure_functions_worker/extension.py", line 147, in get_async_invocation_wrapper
result = await function(**args)
File "/home/site/wwwroot/AzureFunctionCloudflare/main.py", line 51, in main
container_client = conn._create_container_client()
File "/home/site/wwwroot/AzureFunctionCloudflare/main.py", line 82, in _create_container_client
return ContainerClient.from_connection_string(self.__conn_string, self.__container_name, logging_enable=False, max_single_get_size=2*1024*1024, max_chunk_get_size=2*1024*1024)
File "/home/site/wwwroot/.python_packages/lib/site-packages/azure/storage/blob/_container_client.py", line 242, in from_connection_string
account_url, secondary, credential = parse_connection_str(conn_str, credential, 'blob')
File "/home/site/wwwroot/.python_packages/lib/site-packages/azure/storage/blob/_shared/base_client.py", line 397, in parse_connection_str
raise ValueError("Connection string missing required connection details.")

 

I generated the blob connection string from the storage account > Shared Access Signature page.

 

I am new to Azure, coming over from Splunk, so I am trying to learn how all of this fits together.


Thanks!


@mwhitener - Did you ever get this working? I'm getting the same error as you.

 

Thanks,

Bajcsi

For what it is worth, I think just taking the SAS URL is not the correct way. I found this page:

https://docs.microsoft.com/en-gb/azure/storage/common/storage-configure-connection-string. There is a section on using SAS for blob storage access, and the connection string looks more like their example:

BlobEndpoint=https://storagesample.blob.core.windows.net;
SharedAccessSignature=sv=2015-04-05&sr=b&si=tutorial-policy-635959936145100803&sig=9aCzs76n0E7y5BpEi2GvsSv433BZa22leDOZXX%2BXXIU%3D
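That matches the error in the original post: the SDK raises "Connection string missing required connection details." when the string does not contain both an endpoint and a credential as key=value pairs, which is why pasting a bare SAS URL (no `BlobEndpoint=` / `SharedAccessSignature=` keys) fails. A rough, stdlib-only sketch of that check (my own approximation, not the Azure SDK's actual parser):

```python
# Sketch (NOT the Azure SDK's real implementation) of the validation behind
# "ValueError: Connection string missing required connection details."
def has_required_details(conn_str: str) -> bool:
    # Connection strings are "Key=Value;Key=Value;..." pairs. Values may
    # themselves contain '=' (e.g. the SAS token), so split each pair on
    # the first '=' only.
    parts = dict(
        p.split("=", 1)
        for p in conn_str.strip().rstrip(";").split(";")
        if "=" in p
    )
    has_endpoint = "BlobEndpoint" in parts or "AccountName" in parts
    has_credential = "SharedAccessSignature" in parts or "AccountKey" in parts
    return has_endpoint and has_credential


# A SAS-based *connection string* passes the check...
sas_conn = (
    "BlobEndpoint=https://storagesample.blob.core.windows.net;"
    "SharedAccessSignature=sv=2015-04-05&sr=b&sig=abc"
)
print(has_required_details(sas_conn))  # True

# ...but a raw SAS URL copied from the portal does not.
print(has_required_details(
    "https://storagesample.blob.core.windows.net/container?sv=2015-04-05&sig=abc"
))  # False
```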

@bajcsi 

 

I ended up figuring it out. I will say that I had to do the manual implementation through VS Code, as we had many different accounts to monitor. It does create a resource group for each one when done that way. If you want more details you can reach out to me in a PM, if that is possible on here.

@mwhitener

To make this work, the AZURE_STORAGE_CONNECTION_STRING value needs to be the Access Keys -> Connection String from the storage account that contains the container the Cloudflare logs are being LogPushed to.

 

And NOT a Shared Access Signature of any form, as one might assume. It should take the form:

DefaultEndpointsProtocol=https;AccountName=storageaccountname;AccountKey=xxxxxxxxxxxxxxxxxxxxxxxx;EndpointSuffix=core.windows.net

Hope this helps others.
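For context on why the Access Keys form works: the SDK can build the blob endpoint URL itself from the AccountName and EndpointSuffix pairs, so no BlobEndpoint key is needed. A rough, stdlib-only sketch of that derivation (my own approximation, not the SDK's actual code; the function name is made up):

```python
# Sketch (NOT the Azure SDK's real code) of how an account-key connection
# string yields the blob endpoint the SDK connects to.
def blob_endpoint_from_connection_string(conn_str: str) -> str:
    # Split "Key=Value;..." pairs on the first '=' of each pair.
    parts = dict(
        p.split("=", 1)
        for p in conn_str.strip().rstrip(";").split(";")
        if "=" in p
    )
    protocol = parts.get("DefaultEndpointsProtocol", "https")
    # Endpoint pattern: <protocol>://<account>.blob.<suffix>
    return f"{protocol}://{parts['AccountName']}.blob.{parts['EndpointSuffix']}"


conn = (
    "DefaultEndpointsProtocol=https;AccountName=storageaccountname;"
    "AccountKey=xxxxxxxxxxxxxxxxxxxxxxxx;EndpointSuffix=core.windows.net"
)
print(blob_endpoint_from_connection_string(conn))
# https://storageaccountname.blob.core.windows.net
```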