Oracle

Oracle 2.0 Upgrade Woes with Self-Hosted Integration Runtime
This past weekend my ADF instance finally got the prompt to upgrade linked services that use the Oracle 1.0 connector, so I thought, "no problem!" and got to work upgrading my self-hosted integration runtime to 5.50.9171.1. Most of my connections use service_name during authentication, so per https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory I should be able to connect using the Easy Connect (Plus) naming convention. When I do, I encounter this error:

Test connection operation failed. Failed to open the Oracle database connection. ORA-50201: Oracle Communication: Failed to connect to server or failed to parse connect string ORA-12650: No common encryption or data integrity algorithm https://docs.oracle.com/error-help/db/ora-12650/

I did some digging on this error code, and the troubleshooting doc (https://learn.microsoft.com/en-us/azure/data-factory/connector-troubleshoot-oracle) suggests that I reach out to my Oracle DBA to update the Oracle server settings. I did, but I have zero confidence the DBA will take any action.

Then I happened across this documentation about the upgraded connector: https://learn.microsoft.com/en-us/azure/data-factory/connector-oracle?tabs=data-factory#upgrade-the-oracle-connector

Is this for real? ADF won't be able to connect to old versions of Oracle? If so, I'm effed, because my company is so, so legacy and all of our Oracle servers are at 11g.

I also tried adding additional connection properties in my linked service connection like this, but I honestly have no idea what I'm doing:

Encryption client: accepted
Encryption types client: AES128, AES192, AES256, 3DES112, 3DES168
Crypto checksum client: accepted
Crypto checksum types client: SHA1, SHA256, SHA384, SHA512

But no matter what, the issue persists. :( Am I missing something stupid? Are there ways to handle the encryption type mismatch client-side from the VM that runs the self-hosted integration runtime?
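One way to narrow this down client-side is to reproduce the negotiation outside ADF: from the SHIR VM, test with a locally installed Oracle client and an explicit sqlnet.ora that advertises the same algorithms you tried in the linked service. The SQLNET.* parameters below are standard Oracle Net settings; the directory handling and the commented sqlplus connect string (host, port, service) are assumptions for illustration, and the 2.0 connector's managed driver may not honor this file — this only isolates whether the server itself can agree on an algorithm.

```shell
# Build a throwaway TNS_ADMIN directory with client-side crypto settings
# mirroring the linked-service properties from the post above.
TNS_ADMIN_DIR=$(mktemp -d)
cat > "$TNS_ADMIN_DIR/sqlnet.ora" <<'EOF'
SQLNET.ENCRYPTION_CLIENT = ACCEPTED
SQLNET.ENCRYPTION_TYPES_CLIENT = (AES256, AES192, AES128, 3DES168, 3DES112)
SQLNET.CRYPTO_CHECKSUM_CLIENT = ACCEPTED
SQLNET.CRYPTO_CHECKSUM_TYPES_CLIENT = (SHA256, SHA1)
EOF
export TNS_ADMIN="$TNS_ADMIN_DIR"

# With an Oracle client installed, run manually (placeholder connect string):
# sqlplus user/pass@//dbhost.example.com:1521/my_service

# Sanity-check the file we just wrote:
grep -c '^SQLNET' "$TNS_ADMIN_DIR/sqlnet.ora"   # prints 4
```

If sqlplus connects with this file in place but ADF still fails, the mismatch is specific to the algorithm set the 2.0 connector offers, which strengthens the case for a server-side sqlnet.ora change by the DBA.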
I would hate to be in the business of managing an Oracle environment and tnsnames.ora files, but I also don't want to re-engineer almost 100 pipelines because of a connector incompatibility.

PowerShell Script Failing with Auth Header and 500 Internal Server Error for REST API
Hi everyone, I'm encountering multiple issues with a PowerShell script that interacts with a REST API to execute batch jobs in FDMEE. The script is supposed to send an HTTP request with a Basic Authorization header, but I'm facing the following problems:

1. "Invalid or Missing Authorization Header" error: When I visit the API URL directly in the browser, I get:

{ "links": [], "status": 9, "details": "EPMFDM-ERROR: Invalid or Missing Authorization Header in request" }

2. Internal Server Error (500): When running the script, it often goes to the catch block and displays a 500 Internal Server Error. Here's the error message I receive in PowerShell:

PS>TerminatingError(Invoke-RestMethod): "Request failed." Error encountered in PowerShell Script.

Here is the script I'm using:

```powershell
# HTTP Basic Authorization. Contains username and password encoded to a Base64 string.
$headers = @{ Authorization = 'Basic encryptedpassword' }

# Set parameter values
$jobName = $args[0]
$uri = "http://server.comm.iocs.address.com:0000/aif/rest/V1/jobs"

try {
    # Monitor status of the current batch run
    Write-Output "Checking Job Status..."
    Start-Sleep -Seconds 5
    $restResponse = Invoke-RestMethod -Uri $uri -Method Get -Headers $headers -ContentType "application/json"
    $lastJobID = $restResponse.items[0].jobID

    $payload = @{
        jobType = "BATCH"
        jobName = $jobName
    } | ConvertTo-Json

    # Establish REST connection and execute batch job using the REST API
    $restResponse = Invoke-RestMethod -Uri $uri -Method Post -Body $payload -Headers $headers -ContentType "application/json"
    $uri = $restResponse.links[0].href

    # Display initial status of the batch
    Write-Output "See below status of batch run..."
    $restResponse = Invoke-RestMethod -Uri $uri -Method Get -Headers $headers -ContentType "application/json"
    $currentJobID = $restResponse.jobID
    Write-Output "Last Job ID: $lastJobID"
    Write-Output "Current Job ID: $currentJobID"
} catch {
    Write-Output "Error encountered in PowerShell Script.."
    Write-Output $_.Exception.Message
    if ($_.InvocationInfo) {
        Write-Output "Error in script: $($_.InvocationInfo.ScriptName)"
        Write-Output "Error on line: $($_.InvocationInfo.ScriptLineNumber)"
        Write-Output "Error in command: $($_.InvocationInfo.Line)"
    }
    if ($_.Exception.Response) {
        Write-Output "HTTP Status Code: $($_.Exception.Response.StatusCode.Value__)"
        Write-Output "Status Description: $($_.Exception.Response.StatusDescription)"
        Write-Output "Response Content: $($_.Exception.Response.Content)"
    }
    exit 1
}
```

Despite my efforts, the request still fails with the "Invalid or Missing Authorization Header" error and occasionally hits a 500 Internal Server Error. Here are the steps I've taken to debug the issues:

- Checked Base64 encoding: Confirmed that the credentials are correctly encoded in Base64.
- Verified header format: Ensured that the Authorization header is correctly formed and included in the request.
- Tested with Postman: Manually tested the API request with Postman using the same Authorization header, and I'm getting the same header authorization error.
- Added detailed error logging: Included more detailed error logging in the catch block to capture HTTP status codes and response content.

I'm looking for advice on what might be causing these issues in the PowerShell script and how I can resolve them. Any insights or suggestions would be greatly appreciated!

SAP on Oracle ASM on Azure in 5 Easy Steps
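One thing worth double-checking for the "Invalid or Missing Authorization Header" error in the PowerShell post above: the Basic scheme expects the Base64 encoding of the plain `user:password` pair. A Base64-encoded *encrypted* password will not decode to valid credentials on the server, which matches the same failure appearing in Postman with the same header. A minimal sketch of building the header correctly (the account name, password, and URL below are placeholders, not values from the post):

```shell
# Basic auth token must be base64("user:password") with the PLAIN password.
USER='epm_admin'        # hypothetical account name
PASS='S3cretPass'       # plain-text password, not an encrypted blob
TOKEN=$(printf '%s:%s' "$USER" "$PASS" | base64)
echo "Authorization: Basic $TOKEN"

# Then send it, e.g. with curl (placeholder URL):
# curl -s -H "Authorization: Basic $TOKEN" \
#   "http://server.example.com:9000/aif/rest/V1/jobs"
```

In the PowerShell script itself, the equivalent would be computing the token from the credential pair rather than hard-coding `'Basic encryptedpassword'` in `$headers`.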
Customers moving SAP on Oracle systems to Azure are strongly advised to follow clear guidance from Oracle, Microsoft, and SAP to move to Automatic Storage Management (ASM). There are many performance, administration, and support benefits with ASM. Customers that have used a migration from on-prem to Azure as an opportunity to move to ASM have given consistently positive feedback that ASM is the best storage solution for Oracle.

Connect to an Oracle database
Hello everyone. I am new to Excel development and need to connect to an Oracle database. I am using Excel 2019 (64-bit) on Windows 10 (64-bit). I downloaded and installed the Oracle client and set up the TNS file to connect to the database, but even after all the configuration Excel warns that the Oracle components for the connection are missing. What should I do so that Excel can "find" the Oracle components and connect?

Oracle VM / Azure Backup / Application Consistent / Script to monitor when database is running/frozen
Hello everyone, I need help with a technical issue. I have a Linux 7 Azure VM running an Oracle DB, on which I am testing Microsoft Azure Backup (without Azure File Share for point-in-time recovery). My Oracle database is running, I have the default /etc/azure/workload.conf, and my VMSnapshotPluginConfig.json file is in its directory (https://github.com/MicrosoftAzureBackup/VMSnapshotPluginConfig).

In the Azure portal:
• Snapshot: 8 minutes
• Application consistent: 4

I want to know when the Oracle database is frozen (8 minutes seems long), but I find nothing concrete in the logs of the Linux VM (alert.log). I saw this GitHub repo that lets you customize the pre/post script with exit codes: https://github.com/MicrosoftAzureBackup/Oracle/blob/master/script.sh

Running it with sh -x script.sh, I saw:

```
[root@VM2-Test /]# sh -x /scripts/script.sh
+ config_file_path=
+ pre_or_post=
+ success=0
+ error=1
+ warning=2
+ status=0
+ log_path=/config_error.log
+ '[' -eq 0 ']'
/scripts/script.sh: line 15: [: -eq: unary operator expected
+ '[' -a ']'
+ .
/scripts/script.sh: line 21: .: filename argument required
.: usage: . filename [arguments]
```

I believe I am missing elements in my code, and on top of that I still won't have the times when the database is started/frozen/stopped. If anyone can help me with my problem, that would be nice.

I also saw this way to collect the times I want, but I'm not sure I'm querying correctly after setting it up: https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/virtual-machines/workloads/oracle/oracle-database-backup-azure-backup.md#remove-the-database-files

```sql
sqlplus / as sysdba
SQL> CREATE PROCEDURE sysbackup.azmessage(in_msg IN VARCHAR2)
AS
  v_timestamp VARCHAR2(32);
BEGIN
  SELECT TO_CHAR(SYSDATE, 'YYYY-MM-DD HH24:MI:SS')
  INTO v_timestamp FROM DUAL;
  DBMS_OUTPUT.PUT_LINE(v_timestamp || ' - ' || in_msg);
  SYS.DBMS_SYSTEM.KSDWRT(SYS.DBMS_SYSTEM.ALERT_FILE, in_msg);
END azmessage;
/
SQL> SHOW ERRORS
```

Any assistance would be most welcome.
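On the sh -x trace above: the empty `config_file_path=` and `pre_or_post=` assignments suggest the sample script reads its config file path and pre/post mode from positional arguments, so a bare manual invocation fails at the `[ -eq 0 ]` test (an empty variable leaves the `[` expression malformed). A minimal sketch of a hook that timestamps each invocation, so you can see when freeze (pre) and thaw (post) happen; the `set --` line simulates how the plugin might call it, and the log path is a hypothetical choice, not the plugin's actual contract:

```shell
# Simulate the plugin's invocation: <config_file_path> <pre|post>
set -- /etc/azure/workload.conf pre

config_file_path=$1
pre_or_post=$2
log=$(mktemp)   # hypothetical log location; use a persistent path in practice

# Record a timestamped line per call; "pre" marks freeze, "post" marks thaw.
echo "$(date '+%Y-%m-%d %H:%M:%S') ${pre_or_post:-manual} hook invoked (config: ${config_file_path:-none})" >> "$log"

# Show the most recent entry.
tail -n 1 "$log"
```

Comparing the pre and post timestamps in such a log would give the freeze window directly, independent of what shows up in alert.log.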
Have a good day!

SAP on Azure General Update August 2022
Woolworths Australia goes live on Azure; new recommendations for SAP on Oracle direct from Oracle themselves; disk bursting, VM bursting, and the new SSD v2 in preview; SQL Server 2022, ODBC 18, and Windows 2022 news for Windows customers, plus news for Linux customers; and important guidance on sizing boot disks on Linux.