Kubernetes
11 Topics

Built a Real-Time Azure AI + AKS + DevOps Project – Looking for Feedback
Hi everyone, I recently completed a real-time project using Microsoft Azure services to build a cloud-native healthcare monitoring system. The key services used include:

- Azure AI (Cognitive Services, OpenAI)
- Azure Kubernetes Service (AKS)
- Azure DevOps and GitHub Actions
- Azure Monitor, Key Vault, API Management, and others

The project focuses on real-time health risk prediction using simulated sensor data. It's built with containerized microservices, infrastructure as code, and end-to-end automation.

GitHub link (with source code and documentation): https://github.com/kavin3021/AI-Driven-Predictive-Healthcare-Ecosystem

I would really appreciate your feedback or suggestions to improve the solution. Thank you!

Azure Container Registry - New comic
Are you a cloud lover? Do you prefer Azure? Do you enjoy learning with a bit of fun? Then you might like the latest Azure Container Registry comic by Jules&Léa. If you want to dive deeper, don't hesitate to visit the official Microsoft documentation: https://learn.microsoft.com/en-us/azure/container-registry/container-registry-intro/?WT.mc_id=AZ-MVP-5005062

Unable to configure log directory of Spark History Server to Storage Blob when deployed on AKS
I am trying to deploy Spark History Server on AKS and want to point its log directory to a Storage Blob. To achieve this, I am putting my configs in the values.yaml file as below:

```yaml
wasbs:
  enableWASBS: true
  secret: azure-secrets
  sasKeyMode: false
  storageAccountKeyName: azure-storage-account-key
  storageAccountNameKeyName: azure-storage-account-name
  containerKeyName: azure-blob-container-name
  logDirectory: wasbs:///test/piyush/spark-history

pvc:
  enablePVC: false

nfs:
  enableExampleNFS: false
```

First, I create the azure-secrets secret using the command below:

```shell
kubectl create secret generic azure-secrets \
  --from-file=azure-storage-account-name \
  --from-file=azure-blob-container-name \
  --from-file=azure-storage-account-key
```

After that, I run the following commands:

```shell
helm repo add stable https://kubernetes-charts.storage.googleapis.com
helm install stable/spark-history-server --values values.yaml --generate-name
```

But while doing so, I get the following error:

```
2020-10-05 19:17:56 INFO HistoryServer:2566 - Started daemon with process name: 12@spark-history-server-1601925447-57c5476fb-5wh6q
2020-10-05 19:17:56 INFO SignalUtils:54 - Registered signal handler for TERM
2020-10-05 19:17:56 INFO SignalUtils:54 - Registered signal handler for HUP
2020-10-05 19:17:56 INFO SignalUtils:54 - Registered signal handler for INT
2020-10-05 19:17:56 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2020-10-05 19:17:56 INFO SecurityManager:54 - Changing view acls to: root
2020-10-05 19:17:56 INFO SecurityManager:54 - Changing modify acls to: root
2020-10-05 19:17:56 INFO SecurityManager:54 - Changing view acls groups to:
2020-10-05 19:17:56 INFO SecurityManager:54 - Changing modify acls groups to:
2020-10-05 19:17:56 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2020-10-05 19:17:56 INFO FsHistoryProvider:54 - History server ui acls disabled; users with admin permissions: ; groups with admin permissions
Exception in thread "main" java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.deploy.history.HistoryServer$.main(HistoryServer.scala:280)
	at org.apache.spark.deploy.history.HistoryServer.main(HistoryServer.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem not found
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
	at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2654)
	at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2667)
	at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:94)
	at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2703)
	at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2685)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:373)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:364)
	at org.apache.hadoop.fs.Path.getFileSystem(Path.java:295)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:117)
	at org.apache.spark.deploy.history.FsHistoryProvider.<init>(FsHistoryProvider.scala:86)
	... 6 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.azure.NativeAzureFileSystem not found
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
	... 16 more
```

Any type of help or suggestion would be appreciated. Thanks in advance!

Regards,
Piyush Thakur

How to limit access to a Docker registry port on an Azure VM to an Azure Kubernetes cluster?
I have a Docker registry running on an Azure VM and an Azure Kubernetes cluster. The Docker images are to be pulled from that registry. How do I limit access to the registry port to that particular cluster?

Create Azure Kubernetes Service Cluster easily - Twitch Stream
I recently sat down for a Twitch stream where I provide a guide to creating Azure resources for your application deployment. You are provided with a script and an application to build a simple, stateless app on Azure Kubernetes Service without any local software installation. You'll see how to create a service principal for Azure Active Directory, apply the username and password to the AKS cluster, and configure a trust to the container registry for your Docker images. You'll also see instructions to containerize a React JS application and then load the YAML manifests with kubectl to deploy your images.

Here are all the important links:

Code and App:
- https://github.com/jaydestro/aks-acr-all-in-one
- https://github.com/jaydestro/react-clock-basic

Supported Kubernetes versions in Azure Kubernetes Service (AKS):
https://docs.microsoft.com/en-us/azure/aks/supported-kubernetes-versions?WT.mc_id=code-github-jagord

Overview of Azure Cloud Shell:
https://docs.microsoft.com/en-us/azure/cloud-shell/overview?WT.mc_id=code-github-jagord

Azure Kubernetes Service - A Beginner's Guide:
https://dev.to/azure/azure-kubernetes-service-a-beginner-s-guide-mkc

On-Call Nightmares Podcast - Episode 34 - Xander Grzywinski - Microsoft
X gonna give it to ya! Xander from the Microsoft Azure Kubernetes SRE team joins me to talk about his history of being on-call and more! Xander is a Site Reliability Engineer at Microsoft; he currently slings containers on Azure Kubernetes Service. Prior to Microsoft, he did all the things with retail tech at both Starbucks and Target. You are always welcome to send him your favorite cat pictures.

Full transcript: https://aka.ms/AA5r8ja
@XanderGrzy
https://github.com/salaxander

How to deploy a YAML file to an Azure Kubernetes cluster
Hello, I am a newbie just starting to learn about Kubernetes. I created a Kubernetes cluster and read this link https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/aks/http-application-routing.md to try deploying an ingress, but I do not know how to deploy a YAML file to the cluster. Please guide and help me.

Best Regards,
Thanks

Azure Kubernetes Service (AKS) - Shared Storage on Linux Containers
Hi,

We are working with Azure Kubernetes Service (AKS) and need shared storage that several Linux containers can use. Using Azure Files gives us shared storage over the SMB protocol, but in our experience this is not optimized for Linux containers. Azure Files presents a "Windows filesystem" that our Linux containers (running Drupal) do not like to talk to: we hit character and directory mismatches. Have we done anything wrong configuring this shared storage? How can we get AKS shared storage that is optimized for use with Linux containers? Currently we are looking at Amazon S3 to get this working, but pairing Azure AKS with Amazon S3 is not a good design.

Br.
Rune
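One pattern that often helps with Azure Files permission and ownership problems on Linux nodes is setting SMB mount options on the PersistentVolume, so files appear with the UID/GID and modes the container user expects. The sketch below is a minimal, hedged example, not a confirmed fix for this exact Drupal setup: the secret name azure-secret, share name aksshare, and uid/gid 1000 are placeholder assumptions, not values from the original post.

```yaml
apiVersion: v1
kind: PersistentVolume
metadata:
  name: azurefile-pv
spec:
  capacity:
    storage: 5Gi
  accessModes:
    - ReadWriteMany
  azureFile:
    secretName: azure-secret   # hypothetical secret holding azurestorageaccountname/key
    shareName: aksshare        # hypothetical Azure Files share name
    readOnly: false
  mountOptions:
    - dir_mode=0777            # make directories writable by the container user
    - file_mode=0777
    - uid=1000                 # present files as owned by the non-root user the app runs as
    - gid=1000
    - mfsymlinks               # emulate symlinks, which SMB lacks natively
    - nobrl                    # disable byte-range locking, which trips up some apps
```

If SMB semantics remain a problem even with these options, an NFS-backed option (such as an NFS server or Azure NetApp Files) is usually a closer match to POSIX filesystem behavior than S3, which is object storage rather than a mountable filesystem.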