migration issue
Fix Broken Migrations with AI Powered Debugging in VS Code Using GitHub Copilot
Data is at the heart of every application, but evolving your schema is risky business. One broken migration, and your dev or prod environment can go down. We've all experienced it: mismatched columns, orphaned constraints, missing fields, or that dreaded "table already exists" error. But what if debugging migrations didn't have to be painful? What if you could simply describe the error or broken state, and AI could fix your migration in seconds?

In this blog, you'll learn how to:
- Use GitHub Copilot to describe and fix broken migrations with natural language
- Catch schema issues like incorrect foreign keys before they block your workflow
- Validate and deploy your database changes using the GibsonAI CLI

Broken migrations are nothing new. Whether you're working on a side project or part of a large team, it's all too easy to introduce schema issues that block deployments or corrupt local environments. Traditionally, fixing them means scanning SQL files, reading error logs, and manually tracking down what went wrong. But what if you could skip all that? What if you could simply describe the issue in plain English and AI would fix it for you? That's exactly what GitHub Copilot lets you do, right from within VS Code.

What You Need:
- Visual Studio Code installed
- A GitHub account
- A GitHub Copilot subscription
- GibsonAI CLI installed and logged in

Let's Break (and Fix) a Migration:
Here's a common mistake. Say you create two tables, users and posts:

CREATE TABLE users (
  id UUID PRIMARY KEY,
  name TEXT,
  email TEXT UNIQUE
);

CREATE TABLE posts (
  id UUID PRIMARY KEY,
  title TEXT,
  user_id UUID REFERENCES user(id)
);

The problem? The posts table refers to a table called user, but you named it users. This one-word mistake breaks the migration. If you've worked with relational databases, you've probably run into this exact thing.

Just Ask GitHub Copilot:
Instead of troubleshooting manually, open Copilot Chat and ask:

"My migration fails because posts.user_id references a missing user table. Can you fix the foreign key?"

Copilot understands what you're asking. It reads the context and suggests the fix:

CREATE TABLE posts (
  id UUID PRIMARY KEY,
  title TEXT,
  user_id UUID REFERENCES users(id)
);

It even explains what changed, so you learn along the way.

Wait, how does Copilot know what I mean? GitHub Copilot is smart enough to understand your code, your errors, and even what you're asking in plain English. It doesn't directly connect to GibsonAI; you'll use the GibsonAI CLI for that. Copilot simply helps you figure things out and fix your code faster.

Validating with GibsonAI
Once Copilot gives you the fixed migration, it's time to test it. Run:

gibson validate

This checks your migration and schema consistency. When you're ready to apply it, run:

gibson deploy

GibsonAI handles the rest: no broken migration chains, no surprises.

Why This Works
Manual debugging of migrations is frustrating and error-prone. GibsonAI with GitHub Copilot:
- Eliminates guesswork in debugging
- Means you don't need to Google every error
- Reduces the time to fix production schema issues
- Keeps you in one tool: VS Code
- Helps you learn while debugging

Whether you're a student learning SQL or a developer on a fast-moving team, this setup helps you recover faster and ship safer. Fixing migrations used to be all trial and error, digging through files and hoping nothing broke. It was time-consuming and stressful. Now, with GitHub Copilot and GibsonAI, fixing issues is fast and simple. Copilot helps you write and correct migrations.
GibsonAI lets you validate and deploy with confidence. So next time your migration fails, don't panic. Just describe the issue to GitHub Copilot, run a quick check with GibsonAI, and get back to building.

Ready to try it yourself? Sign up at gibsonai.com

Want to Go Further?
If you're ready to explore more powerful workflows with GibsonAI, here are two great next steps:
- GibsonAI MCP Server – enable Copilot Agent Mode to integrate schema intelligence directly into your dev environment.
- Automatic PR Creation for Schema Changes – an in-depth guide on how to automate pull requests for database updates using GibsonAI.

Want to Know More About GitHub Copilot?
Explore these resources to get the most out of Copilot:
- Get Started with GitHub Copilot
- Introduction to prompt engineering with GitHub Copilot
- GitHub Copilot Agent Mode
- GitHub Copilot Customization
- Use GitHub Copilot Agent Mode to create a Copilot Chat application in 5 minutes
- Deploy Your First App Using GitHub Copilot for Azure: A Beginner's Guide

That's it, folks! But the best part? You can become part of a thriving community of learners and builders by joining the Microsoft Student Ambassadors Community. Connect with like-minded individuals, explore hands-on projects, and stay updated with the latest in cloud and AI. 💬 Join the community on Discord and explore more benefits on the Microsoft Learn Student Hub.
Migrate SharePoint 2013 Pages to SharePoint Online

I am attempting to migrate SharePoint 2013 pages to SharePoint Online, including all content and web parts on those pages. I haven't been able to complete this with a copy via File Explorer, with SharePoint Designer, or with the SharePoint Migration Tool. Is there something I'm missing on moving web part pages from SharePoint 2013 to SharePoint Online? Is it even possible, or am I going to have to recreate the pages manually?
M-Files to SharePoint Online file migration

Does anybody have any insights on how to migrate files from the M-Files file management system to SharePoint? In particular, are there any third-party tools that could automate the file transfer in the same way Sharegate does? I have tried to find information from multiple migration software vendors, but with no luck.
Hybrid mailbox migration | Get-MoveRequest shows waiting for job pickup for almost 7 days

Hi All,

This is a hybrid migration from Exchange 2016 on the latest CU. All other mailbox batches migrated without any issue, except one mailbox that failed; the details show 104 items were skipped. I moved that single user in its own batch (roughly 10 GB of data, on a 400 Mbps internet connection), but the GUI still shows a "Synced" status while PowerShell shows "WaitingForJobPickup" and 0% completed, even after running Resume-MoveRequest with -BadItemLimit 200 -AcceptLargeDataLoss. MS support has been no help, and we have no Premier support either. I'd appreciate a comment if you have experienced this situation before with a hybrid migration.
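(For anyone comparing notes, a minimal diagnostic sketch of the checks that usually come next, run from Exchange Online PowerShell. The mailbox identity, hybrid endpoint, and delivery domain below are placeholders, and removing and re-submitting the move request is only a common recovery path, not official guidance.)

# Check where the move is actually stuck
Get-MoveRequestStatistics -Identity "user@contoso.com" |
    Format-List Status, StatusDetail, PercentComplete, BadItemsEncountered, Message

# The full report often reveals why the job is never picked up (throttling, stalled MRS, etc.)
Get-MoveRequestStatistics -Identity "user@contoso.com" -IncludeReport |
    Select-Object -ExpandProperty Report

# If the request stays in WaitingForJobPickup, one common recovery path is to remove it
# and submit a fresh remote move for just that mailbox
Remove-MoveRequest -Identity "user@contoso.com" -Confirm:$false
New-MoveRequest -Identity "user@contoso.com" -Remote -RemoteHostName "mail.contoso.com" `
    -TargetDeliveryDomain "contoso.mail.onmicrosoft.com" -RemoteCredential (Get-Credential) `
    -BadItemLimit 200 -AcceptLargeDataLoss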
How to add ALL containers to Stream migration tool?

Has anyone migrated their videos using the migration tool yet? I added some containers to the tool, and when attempting to scan a container I get "Unable to access". No containers show up in the migration "Scans" section. Do I really have to add all containers for the entire tenant manually? If so, what is the easiest way to accomplish that?
Cross-Tenant OD4B -> OD4B Migration With Modern Auth. MigrationWiz Won't Work - SPO PS?

Hi,

We've acquired a company that also has an O365 tenant. I've had a great experience using BitTitan MigrationWiz to migrate the mailboxes over. It was a dream and I highly recommend it. Phase II was to perform a document migration: all source users would move everything into their OD4B, and I would use the document migration component of MigrationWiz to forklift all data out and over into their new OD4B sites in our tenant. MigrationWiz supports this migration.

However, in our (the destination) tenant we have implemented Modern Authentication and Conditional Access policies, along with some MFA. I got rid of the MFA for the migration account and excluded it from the Conditional Access policies. When I tried the pilot migration I got the error below:

Your migration failed checking destination credentials. Cannot contact web site 'https://TENANTNAME-admin.sharepoint.com/' or the web site does not support SharePoint Online credentials. The response status code is 'Unauthorized'. The response headers are 'Content-Type=text/plain; charset=utf-8, P3P=CP="ALL IND DSP COR ADM CONo CUR CUSo IVAo IVDo PSA PSD TAI TELo OUR SAMo CNT COM INT NAV ONL PHY PRE PUR UNI", X-SharePointHealthScore=3, X-MSDAVEXT_Error=917656; Access+denied.+Before+opening+files+in+this+location%2c+you+must+first+browse+to+the+web+site+and+select+the+option+to+login+automatically., SPRequestDuration=27, SPIisLatency=4, X-Powered-By=ASP.NET, MicrosoftSharePointTeamServices=16.0.0.7716, X-Content-Type-Options=nosniff, X-MS-InvokeApp=1; RequireReadOnly, X-MSEdge-Ref=Ref A: A023B5802BE4460D83753583C1F81C92 Ref B: BL2EDGE0918 Ref C: 2018-05-30T18:36:12Z, Date=Wed, 30 May 2018 18:36:11 GMT, Content-Length=0'.

BitTitan told me this is because their tool only supports Set-SPOTenant -LegacyAuthProtocolsEnabled $True, and part of our Modern Auth config sets that to $false.

Is the only way to use this tool and complete the OD4B migration to reset the tenant-wide LegacyAuthProtocols setting to $true? Otherwise, can SPO PowerShell be used to connect to the other tenant with appropriate creds, iterate through a source user's OneDrive files and folders, then connect to the target tenant with separate creds and populate a corresponding OneDrive with that data? We only have about 40 users, and I can use a CSV for source/target mapping. This is a tough one.

Thanks,
John

https://answers.microsoft.com/en-us/msoffice/forum/msoffice_o365admin-mso_manage/connect-sposervice-fails-unable-to-access/0aa6665b-c3d8-4138-92e4-6dfab2cbf038
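(A minimal sketch of the tenant-level toggle being discussed, assuming the SharePoint Online Management Shell; the admin URL is a placeholder. Note that the SPO module only manages sites and tenant settings, so the file copy itself would still need MigrationWiz or the PnP cmdlets.)

# Connect to the destination tenant's admin endpoint (URL is a placeholder)
Connect-SPOService -Url "https://TENANTNAME-admin.sharepoint.com"

# Check the current setting MigrationWiz depends on
Get-SPOTenant | Select-Object LegacyAuthProtocolsEnabled

# Temporarily allow legacy auth for the duration of the migration window
Set-SPOTenant -LegacyAuthProtocolsEnabled $true

# Enumerate the OneDrive (personal) sites that would be migration targets
Get-SPOSite -IncludePersonalSite $true -Limit All -Filter "Url -like '-my.sharepoint.com/personal/'" |
    Select-Object Url, Owner

# ...run the MigrationWiz document pass here...

# Re-disable legacy auth once the migration is done
Set-SPOTenant -LegacyAuthProtocolsEnabled $false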
MigrationPermanentException

We are in a hybrid environment, and while doing a migration from the cloud to on-premises it shows the error below:

Error: MigrationPermanentException: Mailbox dumpster size 36.46 GB (39,153,389,259 bytes) exceeds target quota 30 GB (32,212,254,720 bytes). --> Mailbox dumpster size 36.46 GB (39,153,389,259 bytes) exceeds target quota 30 GB (32,212,254,720 bytes).

I have checked my on-premises database storage limits, and as far as I can tell this is not an issue with those limits. I'm attaching a screenshot of the storage limit settings; please check and let me know if something may be wrong.
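(The 30 GB "target quota" in that error refers to the Recoverable Items, i.e. dumpster, quota rather than the database storage limits. Below is a minimal sketch of the usual checks; "DB01" and "user@contoso.com" are placeholders, and whether the database-level or mailbox-level value governs the offboarding move depends on how the target mailbox is provisioned.)

# In Exchange Online: confirm how large the Recoverable Items folder (dumpster) really is
Get-MailboxFolderStatistics -Identity "user@contoso.com" -FolderScope RecoverableItems |
    Select-Object Name, FolderAndSubfolderSize, ItemsInFolderAndSubfolders

# On-premises Exchange Management Shell: check the Recoverable Items quotas on the target database
Get-MailboxDatabase -Identity "DB01" |
    Select-Object Name, RecoverableItemsQuota, RecoverableItemsWarningQuota

# Raise the Recoverable Items quota on the target database above the reported 36.46 GB dumpster size
Set-MailboxDatabase -Identity "DB01" -RecoverableItemsWarningQuota 40GB -RecoverableItemsQuota 45GB

# If the on-premises mailbox already exists (e.g. a retried move), the same quota can be raised per mailbox
Set-Mailbox -Identity "user@contoso.com" -RecoverableItemsWarningQuota 40GB -RecoverableItemsQuota 45GB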