BI & Data Analysis
2480 Topics

Complex formulae with data model
Over the years, I have developed a significant Excel data sheet which has become slower and slower over time. This week I discovered Excel's data model, learnt how to import the data I need from its source CSV into the data model, and I now have a Power Pivot which looks like what I originally had in Excel. What I would like to do next is replicate some of the formulae I had in Excel to calculate whether something should be counted or not. I would not want the red one to be counted, as this row would have been counted in the previous month. In Excel I was using the following formula:

=IF(COUNTIF(B5:M4,">0"),SUM(Patient!B5:M5*0),Patient!N5)

Is there a way of achieving this with the data model, or would I have to hardcode this into another sheet? I was trying to avoid doing this to minimise the number of formulae... Many thanks for your help.

unpivot data and handle merged cells without using Power Query (Unpivot_Toolkit)
Hey, guys! I've been working on a set of functional Excel Lambdas to solve a common headache: transforming "wide" human-readable data into "long" database formats without having to open Power Query every time.

=UNPIVOT_PLUS(table, [no_of_cols], [no_of_headers], [attribute_names], [value_name], [remove_errors], [remove_blanks], [pad_blanks_with], [pad_errors_with])

Don't worry about the full list; most use cases only require 5 arguments. I've included a table of default values at the end. Merged-cell support: automatically handles fill-down/fill-right logic for merged headers/columns.

Bonus helper: SPLIT_INJECT

=SPLIT_INJECT(array, target_indices, delimiter, [ignore_empty], [match_mode], [pad_with])

It targets specific columns, splits them by a delimiter (like TEXTSPLIT), and expands the entire table horizontally while keeping all other columns perfectly stable. Optional arguments match TEXTSPLIT defaults.

Feel free to tear this apart or adapt it for your own edge cases. I'd love to hear how you end up using it! You can grab both functions from my GitHub Gist: https://gist.github.com/Medohh2120/f8553c149684e39bb499249e39f01017 File with use cases: https://1drv.ms/x/c/6c310c8fd1669a94/IQCscpo3yh7jR5XdGZe2AQGyAf4-vCd8K6BvLZQgrP2V8Oo?e=gPdbOd

Argument | Description | Default behaviour
table | The array or range of data to unpivot. | Required (no default)
[no_of_cols] | Fixed left columns to keep as identifiers. | 1
[no_of_headers] | Top rows used as headers, handling merged cells. | 1
[attribute_names] | Header name for the unpivoted attributes. | "Attribute"
[value_name] | Header name for the unpivoted values. | "Value"
[remove_errors] | Excludes grid rows with formula errors. | FALSE
[remove_blanks] | Removes empty cells and empty strings from the grid. | TRUE
[pad_blanks_with] | Value to substitute for empty cells. | Leaves the cell blank
[pad_errors_with] | Value to substitute for errors. | Leaves the error as-is

Excel 365 (version 2307 & higher) crashes when refreshing pivot tables (more pivot tables at once)
Dear all, I am currently experiencing the following problem in Excel 365 version 2308 with pivot tables (on MSAS connections): Excel always crashes if I refresh data for all connections for several pivot tables at once, especially when I have 2 pivot tables with the same connection on 1 sheet. This happens every time :o(

Excel will NOT crash in the following situation: 1. I first change some filter on each pivot table (without refreshing); 2. after that, I can refresh the pivot tables. I must do the same after opening the Excel file the next time. I have tested it on multiple machines and the symptoms are consistent across all machines with Excel 365 version 2307 & higher.

A stable workaround is the following:
1. Use a PC with a lower version of Excel (I have an old notebook with Excel 365 version 1808 at home).
2. Change the connections to a local MSAS on that PC (I have no connection to the original MSAS from the local PC).
3. Open the file & refresh all connections.
4. Save & copy the file to the PC with Excel 365 version 2308.
5. Open the file & change the connections back to the original MSAS.
6. After that, refresh works fine. The file will now work in Excel 365 version 2308.

Can you kindly advise if anything can be done about this issue? I have many Excel files on version 2308. Thank you in advance.

Correlation Study with Filters
Dear Experts, I have data like below (attached worksheet) and want to study the correlation between SNR[0~3]; there will be 6 combinations, as below. I want to make something like this, but with filters/slicers for "File.Name", "SSB or TRS" and "Carrier Index", and different colours for the different Carrier Index values (0 & 1). Thanks & Regards, Anupam Shrivastava

Data Formatting
Dear Experts, I have data like below. Each row can have 1 or more RNTIs (separated by ",") and their DCI (UL_01 or UL_00). I want to put them in a format like in columns I~P; I gave examples for the first two entries. Attached worksheet. Thanks & Regards, Anupam Shrivastava

Filter Function or TAKE-DROP Function
Dear Experts, I have data like this: Column A has the file names, and columns B, C, D have their corresponding data. In column F I have the unique file names, and in G/H/I I need the start of the hfn/sfn/slot, and in J/K/L the end of the hfn/sfn/slot, for that file name as populated. How to achieve this? Thanks & Regards, Anupam Shrivastava

Cleaning messy excel/csv from ERP system
Hi, I'm curious how people here usually deal with messy exports coming from ERP or accounting systems. I often receive Excel or CSV files that are not really ready to use as a table. Things like:
- a few rows of system info before the header
- totals or notes mixed inside the data
- different date formats in the same column
- currency symbols and formats all over the place
- sometimes even another small table sitting under the first one

Before doing any analysis (Excel, Power BI...), I usually have to spend a bit of time cleaning the file so it becomes a normal structured table. Power Query definitely helps, but depending on the export I sometimes still end up doing several cleanup steps first. I'm curious what the usual workflow is for people here. Do you normally build a reusable Power Query transformation for these reports, or do you handle each export differently? I recently walked through one messy export example while documenting the cleanup steps, if anyone is curious about the kind of file I mean: https://www.mubsiraanalytics.com/blog/how-to-extract-clean-tables-from-messy-excel-csv Mostly just interested to see how others approach this.

Data Driven Analytics for Responsible Business Solutions, learning how to work with Power BI
Introduction

In this blog post, we will showcase the project we have worked on for the last couple of weeks. Here, we analysed a dataset using Power BI and its machine learning capabilities. For this, we were given the fictitious case of VenturaGear. The company was faced with the challenge of new competition, and it was our job to provide a data-driven insight into customer behaviour, feedback, and preferences. The objective was to support more effective customer targeting by identifying patterns and segments that could inform strategic decision-making, while ensuring ethical and responsible use of data. Before we jump into the course and our final results, we would like to introduce ourselves and the roles we had.

Product Owner: Kylie Eggen

Hello everyone! My name is Kylie, and I'm currently busy finishing my Master's in Responsible Digitalisation. During the DARBS course, I had the role of product owner. This allowed me to develop a deeper understanding of both data analysis and the ethics of handling sensitive data. The course provides you with skills that could be useful in your future career, which is very nice. I liked the learning experience a lot and will definitely use it in the future! Kylie Eggen | LinkedIn

Data Analyst: Ha Nguyen

I am currently in the final stage of my Master's degree in Responsible Digitalisation, focusing on the ethical and strategic use of data-driven technologies. With five years of experience using Excel for data analysis, I have developed a strong foundation in data handling and visualisation. This course allows me to expand my skills by learning to create interactive dashboards and generate actionable insights using Power BI. These competencies strengthen my ability to support responsible, data-driven decision-making in my future professional career. Ha Nguyen | LinkedIn

Data Analyst: Rianne van Ee

Hello! My name is Rianne, and I am currently in the process of completing my Master's degree in Responsible Digitalisation.
I chose this specialisation because I am very interested in new technologies and different perspectives. I am very interested in data analysis and learning about new software, so the DARBS course was very interesting to me. I am excited to apply my new skills in a professional environment. Rianne van Ee | LinkedIn

Data Visualisation Consultant: Aya Torqui

Hello! My name is Aya Torqui, and I am a Master's student in Responsible Digitalisation at Radboud University. One of the reasons I chose this specialisation is my strong interest in how companies transform raw and sometimes ambiguous data into valuable business decisions. The DARBS course therefore provided the perfect opportunity for me to gain new and deeper insights into this process. In my role as a Data Visualisation Consultant, I developed new skills not only in designing visually attractive and interesting dashboards, but also in communicating a meaningful and coherent story through them. I am grateful for the opportunity to have developed these skills during the course, and I look forward to further broadening and strengthening them in my future career. Aya Torqui | LinkedIn

Data Visualisation Consultant: Ting Yu

Hi! My name is Ting Yu. I am currently a Master's student of Civil Law and Responsible Digitalisation. I found the DARBS course quite interesting, and it was a whole new experience for me, because I learned that numbers are not boring. With a dashboard, it is possible to tell a story and help organisations. What I also really liked about this course was the creative side. Not only was it fun to play around with different charts and colour schemes for the dashboard, but also the video we had to make! I am curious to see what the future possibilities are. Ting Yu | LinkedIn

Project Overview

The goal of this project was to provide data-driven managerial recommendations to the fictitious company VenturaGear.
Eventually, it was our task to deliver a final report and a video blog in which we discussed their data and gave them recommendations on how to improve. Our focus was on supporting more effective customer targeting by identifying patterns and segments that could inform strategic decision-making. During the process, one of our main goals was to keep the data analysis responsible and ethical.

Project Journey

The course followed a clear structure, allowing us to learn about Power BI gradually and expand our skills and knowledge over a couple of weeks. We started off by completing lab work: every week we completed several online courses and spent one lecture applying the knowledge from these courses in a lab work assignment. After a few weeks, we applied our knowledge in a milestone assignment. This was the first time we really applied our newfound skills in a practical manner, and a really nice opportunity to see whether we could actually apply what we had learned. This also came with a machine learning aspect. Even though we had a short introduction to the topic in class, none of us had worked with machine learning before. We were able to apply what we had learned about picking up a new system, like Power BI, to another system, in this case machine learning. While we really struggled at the start, after some time we figured it out and were able to work with the technology. This milestone assignment was the perfect preparation for the actual final assignment, which also had a machine learning aspect. We now knew where to start, what data to include, etc. We also knew what to consider on the ethical side of things, like what information needs to be anonymised or left out completely. Eventually, all our newfound knowledge was combined in the final assignment and video blog.

Technical Details

Microsoft Power BI served as the main analytical environment throughout the project.
We began by importing multiple CSV datasets into Power BI and preparing the data using Power Query. This involved cleaning duplicate records, correcting formatting inconsistencies, and transforming variables to ensure accurate calculations and reliable analysis. We then created a relational data model connecting key tables such as sales transactions, product information, customer behaviour, and sales reasons. Establishing these relationships allowed us to analyse data across multiple dimensions and generate deeper insights into customer activity and online purchasing patterns. Interactive dashboards were developed using Power BI's visualisation tools, accessible colour themes, and slicers, allowing users to explore insights dynamically. Rather than presenting static results, the dashboard encouraged managers to interact with the data and investigate patterns independently. In addition to descriptive analytics, we applied a machine learning model (XGBoost) to identify factors influencing the sales of the top revenue-generating products. This introduced us to predictive analytics and highlighted the importance of feature selection, handling missing values, and critically interpreting model outputs. Combining visualisation with machine learning enabled us to move beyond reporting toward data-driven decision support.

Results and Outcomes

Before we could analyse our data, we ran into a few problems. Firstly, our unit prices seemed to be inflated in the dataset: the decimal separator had been removed, leading to unreasonably high prices. To solve this, we recalculated the LineTotal using the formula that can be seen below. Another problem we ran into was a lot of missing data. We noticed this while looking at the sales reasons: a third of the data was blank. We ended up excluding the blank values, so that we were still able to analyse the remaining data.
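The blog performs these fixes inside Power Query; as a minimal sketch of the same two cleanup steps outside Power BI, here is the logic in plain Python. The column names (UnitPrice, OrderQty, SalesReason) and the assumption that prices lost exactly two decimal places are illustrative, not the actual VenturaGear schema:

```python
# Sketch of the two data-quality fixes described above.
# Assumptions (not from the original dataset): prices lost exactly two
# decimal places, and column names are UnitPrice / OrderQty / SalesReason.

rows = [
    {"UnitPrice": 349900, "OrderQty": 2, "SalesReason": "Price"},
    {"UnitPrice": 120050, "OrderQty": 1, "SalesReason": ""},        # missing reason
    {"UnitPrice": 89999,  "OrderQty": 3, "SalesReason": "Quality"},
]

# Fix 1: restore the stripped decimal and recalculate LineTotal.
for row in rows:
    row["UnitPrice"] = row["UnitPrice"] / 100           # 349900 -> 3499.0
    row["LineTotal"] = row["UnitPrice"] * row["OrderQty"]

# Fix 2: exclude blank sales reasons before analysing that column.
with_reason = [r for r in rows if r["SalesReason"].strip()]

print(rows[0]["LineTotal"])   # 6998.0
print(len(with_reason))       # 2
```

In Power Query the equivalent steps would be a division in a custom column followed by a row filter; the point is only that both fixes are simple, mechanical transformations once the problem is diagnosed.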
To really target customers effectively, we felt it was important to analyse the reasons people made their purchases. Through our analysis, we found that for VenturaGear, the biggest contributor was price. We also found that VenturaGear mainly made its sales in Australia.

Lessons Learned

1. Working with new systems
The main lesson we learned is how to start using a new system. The way in which we were taught to use Power BI showed us a good way of approaching new things. We believe this can be useful in other areas of our professional lives.

2. Data analysis
Most of us were a little intimidated when we first heard that we were going to be analysing data through a new program. However, once we started, we noticed that when we all put our minds to it, it is quite manageable. We have all gained some understanding of data analysis and how to visualise it.

3. Teamwork
A big factor during this project was teamwork. Our team was divided into different roles. That meant there was teamwork between the two data analysts and the two data visualisation consultants, but also between the different roles. We found it really important to have teamwork between all these actors, and we noticed that the further we got into the project, the smoother this interaction went.

Collaboration and Teamwork

On this project, we worked as a team of five people. Kylie Eggen was the Product Owner; her role was to keep an overview of the project. Ha Nguyen and Rianne van Ee were the Data Analysts. Aya Torqui and Ting Yu were the Data Visualisation Consultants. We mostly stuck to our roles, but noticed that everything needed to happen in collaboration. So even though we were all mainly busy with our own roles, we were all involved in each other's work as well. We noticed this really helped in making the project a coherent whole.

Future Development

While this project generated valuable insights, there are several opportunities for further development.
A potential next step would be integrating real-time data into Power BI. Expanding the dashboard with automated data refresh would allow managers to track performance continuously and respond more quickly to changing customer behaviour. Another area for future development involves extending the machine learning component. Rather than focusing only on identifying predictors of key revenue-generating products, the model could be expanded to include customer segmentation, such as grouping customers into categories like high-value customers, discount-sensitive buyers, or frequent online shoppers. In addition, the model could be developed further to support purchase prediction, enabling forecasts of seasonal demand, identifying customers likely to make repeat purchases, and determining which products are most preferred by specific customer groups. These enhancements would provide a more dynamic understanding of customer behaviour and support more targeted, data-driven decision-making. Incorporating more complete behavioural data or improving survey participation rates would also help reduce missing values and increase the reliability of insights. Finally, for future research, the organisation could consider introducing clear consent options on the web shop to help customers better understand what data is being collected. These options would also allow customers to choose what information they want to share, improving transparency and strengthening customer trust.

Conclusion

This project allowed us to learn how data analytics can help organisations make smarter and more responsible business decisions. Using Power BI, we transformed complex customer and sales data into clear, interactive insights that help managers better understand online behaviour, purchasing motivations, and performance trends. Beyond building technical skills, we also learned how important data quality, transparency, and ethical considerations are when working with sensitive customer data.
Throughout the project, we discovered that data analysis is an iterative process that requires continuous evaluation, critical thinking, and careful interpretation of results. Most importantly, we realised that meaningful analytics is never an individual effort but a collaborative process, where teamwork and shared problem-solving play a key role in turning data into valuable insights. Overall, this project strengthened our ability to bridge technical analytics with responsible digitalisation principles. By combining business understanding, visualisation skills, and ethical awareness, we gained a clearer perspective on how tools like Power BI can enable professionals to create meaningful, data-driven solutions that are both impactful and responsible.

Call to Action

After experiencing this learning journey, we encourage you to engage with tools such as Power BI. As our teacher told us, "You are going to hit a wall." That is exactly what happened to us, but pushing through those moments allowed us to develop a deeper understanding and new skills. At the same time, we tried to stay aware of the ethical implications of working with data. During the project, we always made sure to stay transparent and responsible in our analysis. We encourage you to challenge yourself! Experiment with new technologies and step outside of your comfort zone. What we also think you should remember is that a strong analysis is not only dependent on technical skills; it is also about staying transparent, responsible, and trustworthy. On behalf of group 3, thank you for taking the time to read our summary. We hope it has been useful. Feel free to reach out for any remaining questions!
Conditional Formatting or a Specific Filter Rule
Dear Experts, I have data like below. Here, column O, txNumber, can go from 1 to 3 (txNumber = 2 or 3 means a retransmission). I want to filter all the pairs of transmissions and their corresponding retransmissions, either by colouring them all using conditional formatting or by using a specific filter rule.

How to identify a transmission and its corresponding retransmission for a specific HARQ ID: say for dlHarqProcessIndex == 7, the blue row 37 is my 1st transmission (txNumber == 1) for dlHarqProcessIndex == 7, with a tbSize of 852696 and adaptRetxStatus == NOTACTIVATE. Its corresponding 2nd transmission (retransmission, txNumber == 2) is row 45. tbSize remains the same as in the corresponding 1st transmission for all retransmissions, and adaptRetxStatus changes to "APPLY" for all retransmissions (txNumber == 2, 3). Similarly for the other pairs, e.g. for dlHarqProcessIndex == 13. Attached is the worksheet.
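Outside Excel, the pairing rule described above (a txNumber == 1 row followed by txNumber == 2/3 rows with the same dlHarqProcessIndex and the same tbSize) can be sketched in a few lines of Python. The sample rows below are illustrative, not taken from the attached worksheet:

```python
# Group each transmission (txNumber == 1) with its retransmissions
# (txNumber == 2/3) that share the same dlHarqProcessIndex and tbSize.
rows = [
    {"row": 37, "dlHarqProcessIndex": 7,  "txNumber": 1, "tbSize": 852696, "adaptRetxStatus": "NOTACTIVATE"},
    {"row": 40, "dlHarqProcessIndex": 13, "txNumber": 1, "tbSize": 401224, "adaptRetxStatus": "NOTACTIVATE"},
    {"row": 45, "dlHarqProcessIndex": 7,  "txNumber": 2, "tbSize": 852696, "adaptRetxStatus": "APPLY"},
    {"row": 51, "dlHarqProcessIndex": 13, "txNumber": 2, "tbSize": 401224, "adaptRetxStatus": "APPLY"},
]

groups = {}  # (harq_id, tbSize) -> worksheet row numbers, in order of appearance
for r in rows:
    key = (r["dlHarqProcessIndex"], r["tbSize"])
    if r["txNumber"] == 1:
        groups[key] = [r["row"]]          # a new 1st transmission starts a group
    elif key in groups:
        groups[key].append(r["row"])      # attach the retransmission to its group

print(groups[(7, 852696)])   # [37, 45]
```

Because HARQ process indices are reused over time, keying on (dlHarqProcessIndex, tbSize) and restarting the group whenever txNumber == 1 appears mirrors the identification rule in the post; each resulting group is one set of rows you would give the same conditional-formatting colour.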