Introduction: Power BI Paginated Reports are ideal for creating highly formatted, pixel-perfect layouts optimized for printing or PDF generation. By integrating OpenAI with Power BI Paginated Reports using Azure Functions, you can enhance your reports with AI-generated insights and content. This blog post provides a step-by-step guide on how to integrate OpenAI with Power BI Paginated Reports using Azure Functions and an intermediary SQL Server database.
Prerequisites:
An OpenAI API key
An Azure account
Power BI Report Builder
Basic knowledge of Power BI Paginated Reports, Azure Functions, and C#
Step 1: Create a SQL Server database
Set up a SQL Server database or use an existing one.
Create a new table to store the AI-generated content:
CREATE TABLE OpenAI_Responses (
    ID INT PRIMARY KEY IDENTITY(1,1),
    Prompt NVARCHAR(MAX),
    GeneratedText NVARCHAR(MAX),
    DateGenerated DATETIME
);
Step 2: Create an Azure Function to call the OpenAI API and store the AI-generated content in the SQL Server database
Set up an Azure Function App with an HTTP trigger and follow the instructions to create a new function.
Add the necessary NuGet packages to call the OpenAI API (e.g., OpenAI) and connect to SQL Server (e.g., System.Data.SqlClient).
Modify the Azure Function code to call the OpenAI API, and insert the AI-generated content into the SQL Server table.
using System;
using System.Data.SqlClient;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.Http;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;
using OpenAI;

public static class OpenAIIntegrationFunction
{
    [FunctionName("OpenAIIntegrationFunction")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("C# HTTP trigger function processed a request.");

        // Get the 'prompt' parameter from the query string
        string prompt = req.Query["prompt"];

        // Read the OpenAI API key from an environment variable
        string openaiApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY");

        // Initialize the OpenAI API client
        var apiClient = new OpenAIApiClient(apiKey: openaiApiKey);

        // Set up the completion request
        var completions = await apiClient.Completions.CreateAsync(
            engine: "text-davinci-002",
            new CompletionRequest
            {
                Prompt = prompt,
                MaxTokens = 50,
                N = 1,
                Stop = null,
                Temperature = 0.7,
            }
        );

        // Capture the generated text from the first completion choice
        string generatedText = completions.Choices[0].Text;

        // Replace with your SQL Server connection string
        string connectionString = "your_sql_server_connection_string";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (SqlCommand command = new SqlCommand(
                "INSERT INTO OpenAI_Responses (Prompt, GeneratedText, DateGenerated) VALUES (@Prompt, @GeneratedText, @DateGenerated)",
                connection))
            {
                command.Parameters.AddWithValue("@Prompt", prompt);
                command.Parameters.AddWithValue("@GeneratedText", generatedText);
                command.Parameters.AddWithValue("@DateGenerated", DateTime.UtcNow);

                await command.ExecuteNonQueryAsync();
            }
        }

        return new OkObjectResult("Data saved to the database.");
    }
}
Copy and paste this code into your Azure Function App. Replace the your_sql_server_connection_string placeholder with your actual SQL Server connection string. This code assumes you have already set up an OpenAI API key as an environment variable within your Azure Function App.
Save the function and test it to ensure it inserts the AI-generated content into the SQL Server table.
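One quick way to test the function is to call its HTTP endpoint directly. The sketch below (Python) builds the request URL; the function URL and key are placeholders you must replace with your own, and with the requests library you would send it via requests.get(function_url, params=params):

```python
from urllib.parse import urlencode

# Hypothetical values - replace with your function app's actual URL and key
function_url = "https://your-function-app.azurewebsites.net/api/OpenAIIntegrationFunction"
params = {
    "code": "your_function_key",           # function-level authorization key
    "prompt": "Summarize Q1 sales trends", # prompt stored alongside the generated text
}

# The full GET request URL the Azure Function will receive
request_url = f"{function_url}?{urlencode(params)}"
print(request_url)
```

After the call returns "Data saved to the database.", verify the new row with a SELECT against the OpenAI_Responses table.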
Step 3: Create a Power BI Paginated Report
Open Power BI Report Builder.
Create a new report or open an existing one.
Add a new “Parameter” to the report:
In the “Report Data” pane, right-click “Parameters” and click “Add Parameter.”
Name the parameter “Prompt.”
Set the data type to “Text.”
Provide a default value or leave it blank.
Add a “Textbox” to the report and set its value to the “Prompt” parameter: =Parameters!Prompt.Value
Step 4: Connect the Power BI Paginated Report to the SQL Server database
In the “Report Data” pane, right-click “Data Sources” and click “Add Data Source.”
Choose “Microsoft SQL Server” as the connection type and provide a name for the data source.
In the “Connection string” field, enter your SQL Server connection string.
Click “OK” to add the data source.
In the “Report Data” pane, right-click “Datasets” and click “Add Dataset.”
Choose the SQL Server data source you just created and click “Query Designer.”
In the “Query Designer,” enter a SQL query to fetch the latest AI-generated content for the given prompt:
SELECT TOP 1 GeneratedText FROM OpenAI_Responses WHERE Prompt = @Prompt ORDER BY DateGenerated DESC
Add the “Prompt” parameter to the query by clicking “Add Parameter” in the “Query Designer.”
Close the “Query Designer” and click “OK” to add the dataset.
Add a “Textbox” to the report and set its value to the AI-generated text: =First(Fields!GeneratedText.Value, "Dataset1")
Conclusion: You now have a Power BI Paginated Report that displays AI-generated content based on the prompt parameter. When the report is run, it will retrieve the latest AI-generated content for the given prompt from the SQL Server database and display it in the report. To update the AI-generated content in the SQL Server database, you can manually call the Azure Function with the specified prompt, or you can create a separate application to automate this process. The Azure Function will then call the OpenAI API, generate the text, and insert it into the SQL Server table.
This approach allows you to leverage the Power BI Paginated Report’s native support for SQL Server as a data source while still incorporating AI-generated content from the OpenAI API. It involves additional steps and requires an intermediary database, but it provides a viable solution for integrating OpenAI with Power BI Paginated Reports.
This blogpost was created with help from ChatGPT Pro and Paginated Report Bear.
Introduction: Power BI is a powerful business analytics tool that allows you to visualize and analyze data. But did you know you can also use it to create an interactive poker game? In this detailed walkthrough, we’ll show you how to design a poker game within a Power BI report. This tutorial is designed for novices, so we’ll go step-by-step through the process.
Pre-requisites:
Power BI Desktop installed on your computer.
A basic understanding of Power BI and its features.
A dataset containing poker card images (PNG or JPEG format) and associated data (card rank and suit).
Step 1: Prepare the dataset Before we begin building our poker game, we need to prepare the dataset. The dataset should contain the following columns:
CardID: A unique identifier for each card.
CardName: The name of the card (e.g., Ace of Spades, Two of Hearts, etc.).
CardImage: The file path or URL to the card image.
Rank: The rank of the card (e.g., Ace, 2, 3, etc.).
Suit: The suit of the card (e.g., Spades, Hearts, etc.).
You can create this dataset using Excel or any other spreadsheet software. Save the dataset in CSV or Excel format.
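Rather than typing all 52 rows by hand, you can generate the deck with a short script. Here is a sketch in Python; the image path pattern is an assumption, so point it at wherever your card images actually live:

```python
import csv

ranks = ["Ace", "2", "3", "4", "5", "6", "7", "8", "9", "10", "Jack", "Queen", "King"]
suits = ["Spades", "Hearts", "Diamonds", "Clubs"]

rows = []
card_id = 1
for suit in suits:
    for rank in ranks:
        rows.append({
            "CardID": card_id,
            "CardName": f"{rank} of {suit}",
            # Hypothetical image path pattern - adjust to your own files
            "CardImage": f"images/{rank.lower()}_of_{suit.lower()}.png",
            "Rank": rank,
            "Suit": suit,
        })
        card_id += 1

# Write the deck out as a CSV ready for Power BI's Text/CSV connector
with open("card_deck.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["CardID", "CardName", "CardImage", "Rank", "Suit"])
    writer.writeheader()
    writer.writerows(rows)

print(len(rows))  # 52
```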
Step 2: Import the dataset into Power BI
Open Power BI Desktop.
Click on “Home” and then click “Get Data.”
Choose “Excel” or “Text/CSV” depending on the format of your dataset.
Browse to the location of your dataset and click “Open.”
In the Navigator window, select the sheet or table containing your dataset and click “Load.”
Step 3: Create a card deck table
In the “Data” view, click on the ellipsis (three dots) next to your dataset’s name and click “Reference.”
Rename the new table as “CardDeck.”
Add a new column called “IsDrawn” with a default value of 0 (indicating the card is not yet drawn).
Click “Close & Apply” to save your changes.
Step 4: Create the poker table layout
Switch to the “Report” view.
Add a new page and rename it as “Poker Table.”
Drag and drop a “Slicer” visual onto the canvas and set its “Field” to “CardDeck[CardID].”
Set the slicer visual to “Single select” mode and hide its header.
Arrange the slicer visual to resemble a deck of cards (you can customize its appearance using the “Format” pane).
Add a “Gallery” visual (custom visual available in the marketplace) to the canvas and set its “Field” to “CardDeck[CardImage].”
Set the gallery visual’s “Filters on this visual” to show only cards with “CardDeck[IsDrawn]” equal to 1.
Step 5: Create the draw card functionality
Add a “Button” visual to the canvas and label it “Draw Card.”
Under the “Action” tab in the “Format” pane, set the “Action type” to “Bookmark.”
Go to the “View” tab and click “Bookmarks.”
Click “Add” to create a new bookmark and rename it “DrawCard.”
With the “DrawCard” bookmark selected, go to the “Data” tab and change the “CardDeck[IsDrawn]” value for the selected card to 1.
Click “Update” in the “Bookmarks” pane to save your changes.
In the “Format” pane, set the button’s “Action” to the “DrawCard” bookmark.
Step 6: Create the reset deck functionality
Add another “Button” visual to the canvas and label it “Reset Deck.”
Under the “Action” tab in the “Format” pane, set the “Action type” to “Bookmark.”
With the “Bookmarks” pane still open, click “Add” to create a new bookmark and rename it “ResetDeck.”
With the “ResetDeck” bookmark selected, go to the “Data” tab and change the “CardDeck[IsDrawn]” value for all cards back to 0.
Click “Update” in the “Bookmarks” pane to save your changes.
In the “Format” pane, set the reset button’s “Action” to the “ResetDeck” bookmark.
Step 7: Add player areas and card placeholders
On the “Poker Table” page, create a section for each player (up to the desired number of players).
Add a “Card” visual for each card placeholder in the player’s area.
Set the “Category” field to “CardDeck[CardName]” and the “Image” field to “CardDeck[CardImage]” for each card visual.
Apply filters to each card visual to show the corresponding card based on the draw order and player.
Step 8: Add game rules and logic (optional)
Adding game rules and logic to your Power BI poker game will require the use of DAX (Data Analysis Expressions) language and additional visuals. In this expanded step, we’ll cover the basics of hand ranking, betting, and winning conditions.
A. Create hand ranking and scoring measures
In the “Data” view, create a new table named “HandRanking” with the following columns: “HandRank”, “HandName”, and “HandDescription”.
Populate the table with standard poker hand rankings, such as High Card, One Pair, Two Pair, Three of a Kind, etc.
In the “CardDeck” table, create a new calculated column called “CardValue” that assigns a numeric value to each card based on its rank, for example:
CardValue = SWITCH(CardDeck[Rank], "Ace", 14, "King", 13, "Queen", 12, "Jack", 11, VALUE(CardDeck[Rank]))
Create a measure named “PlayerHandScore” in the “CardDeck” table using a DAX formula that calculates the poker hand score for each player based on their drawn cards.
Creating a “PlayerHandScore” measure in DAX to evaluate poker hands is a complex task due to the intricate rules of poker hand rankings. It’s important to note that this measure would require advanced DAX calculations, and the provided solution here will be a simplified version that may not cover all edge cases.
For this example, let’s assume we have a table “PlayerCards” containing the “PlayerID” and the “CardID” of their drawn cards. Here’s a basic outline of a possible DAX formula to calculate a simplified “PlayerHandScore” measure:
Calculate the count of each rank and suit for each player.
Use these counts to evaluate different poker hand types.
Assign a numeric score to each hand type.
A full DAX implementation of “PlayerHandScore” following this outline is lengthy, so we won’t reproduce it in full here.
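To make the three-step outline above concrete, here is the hand-evaluation logic sketched in Python rather than DAX. It is a simplified scorer covering only a subset of hand types (no straights, for instance), and the function and scores are illustrative, not part of the Power BI report:

```python
from collections import Counter

def player_hand_score(cards):
    """Score a 5-card hand given as (rank, suit) tuples, with ranks 2-14 (Ace = 14).
    Returns a simplified score - higher is better. Covers only a subset of hand types."""
    ranks = [rank for rank, _ in cards]
    suits = [suit for _, suit in cards]
    rank_counts = sorted(Counter(ranks).values(), reverse=True)
    is_flush = len(set(suits)) == 1

    if rank_counts[0] == 4:
        return 700  # Four of a Kind
    if rank_counts[:2] == [3, 2]:
        return 600  # Full House
    if is_flush:
        return 500  # Flush
    if rank_counts[0] == 3:
        return 400  # Three of a Kind
    if rank_counts[:2] == [2, 2]:
        return 300  # Two Pair
    if rank_counts[0] == 2:
        return 200  # One Pair
    return max(ranks)  # High Card

# A full house outranks two pair
full_house = [(9, "Hearts"), (9, "Spades"), (9, "Clubs"), (4, "Hearts"), (4, "Diamonds")]
two_pair = [(13, "Hearts"), (13, "Spades"), (7, "Clubs"), (7, "Hearts"), (2, "Diamonds")]
print(player_hand_score(full_house), player_hand_score(two_pair))  # → 600 300
```

Translating this counting logic into DAX would typically involve COUNTROWS over the player's drawn cards grouped by rank and suit.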
B. Create the scoreboard
In the “Report” view, create a new page named “Scoreboard”.
Add a “Table” visual to the canvas.
Drag the “Player”, “HandName”, and “PlayerHandScore” fields to the “Values” section of the table visual.
Sort the table by “PlayerHandScore” in descending order.
C. Create betting logic
Add a new table named “Player” with the following columns: “PlayerID”, “PlayerName”, and “PlayerBalance”.
In the “Player” table, create a calculated column named “PlayerBet” to store the bet amount for each player.
Add a “Chiclet Slicer” custom visual (available in the marketplace) to the “Poker Table” page, setting its “Field” to “Player[PlayerID]”. Set the chiclet slicer visual to “Single select” mode.
Add another “Chiclet Slicer” custom visual to the canvas to allow players to input their bet amounts. Set the visual’s “Field” to “Player[PlayerBet]”. Configure the chiclet slicer’s format and appearance to display bet amounts clearly.
Create a calculated column named “UpdatePlayerBalance” in the “Player” table that subtracts the bet amount from the player’s balance when a bet is placed, for example:
UpdatePlayerBalance = Player[PlayerBalance] - Player[PlayerBet]
This calculated column subtracts the bet amount (Player[PlayerBet]) from the player’s balance (Player[PlayerBalance]) for each player, updating the balance accordingly. Please note that this assumes the “PlayerBet” column is a static input in the “Player” table. If the bet amount changes dynamically during the game, you may need a more advanced DAX technique.
The more advanced technique for updating player balances in real-time involves using measures, buttons, and bookmarks. Here’s a step-by-step guide on how to implement this method:
In the “Player” table, create a measure called “SelectedPlayer” to identify the currently selected player:
SelectedPlayer = SELECTEDVALUE(Player[PlayerID])
Create another measure called “SelectedBet” to identify the currently selected bet amount:
SelectedBet = SELECTEDVALUE(Player[PlayerBet])
Create a measure called “UpdatePlayerBalance” in the “Player” table. This measure will display the updated balance after placing a bet, for example:
UpdatePlayerBalance = CALCULATE(SUM(Player[PlayerBalance])) - [SelectedBet]
Add a “Button” visual to the “Poker Table” page and label it “Place Bet”. Set the button’s “Action type” to “Bookmark”.
Go to the “View” tab and click “Bookmarks”. Click “Add” to create a new bookmark and rename it “PlaceBet”.
With the “PlaceBet” bookmark selected, click “Add” in the “Selection” pane. Select the “Player” table and the “Chiclet Slicer” for the bet amount. Set the “Data” property for the selected player and bet amount.
Click “Update” in the “Bookmarks” pane to save your changes.
In the “Format” pane, set the “Place Bet” button’s “Action” to the “PlaceBet” bookmark.
Add a “Card” visual to the “Poker Table” page to display the updated player balance. Set the “Field” to the “UpdatePlayerBalance” measure.
Now, when a player selects their bet amount and clicks the “Place Bet” button, the player’s balance will be updated in real-time. Note that this implementation requires users to click the “Place Bet” button to update the balance. You can further refine the solution to make it more interactive and user-friendly by incorporating additional buttons, visuals, and DAX expressions.
D. Determine the winner
Create a measure named “Winner” in the “Player” table that identifies the player with the highest “PlayerHandScore”, using the following DAX formula:
Winner =
VAR MaxPlayerHandScore = MAXX(ALL(Player), [PlayerHandScore])
VAR WinningPlayerID =
CALCULATE (
SELECTEDVALUE ( Player[PlayerID] ),
FILTER ( ALL ( Player ), [PlayerHandScore] = MaxPlayerHandScore )
)
VAR WinningPlayerName = LOOKUPVALUE ( Player[PlayerName], Player[PlayerID], WinningPlayerID )
RETURN
WinningPlayerName
In this formula, we first calculate the maximum “PlayerHandScore” across all players using the MAXX function. Then, we identify the winning player’s ID by filtering the “Player” table for the player with the maximum hand score. Finally, we use the LOOKUPVALUE function to return the name of the winning player based on the winning player’s ID.
Add a “Card” visual to the “Poker Table” page and set its “Category” field to “Winner”. This visual will display the name of the winning player.
E. Add reset game functionality
Follow the steps outlined in Step 6 to create a “Reset Game” button and bookmark.
With the “ResetGame” bookmark selected, reset the “PlayerHandScore”, “PlayerBet”, and “CardDeck[IsDrawn]” values for all players and cards back to their initial values.
Update the “ResetGame” bookmark to save your changes.
Step 9: Publish and share the report
Save your Power BI report by clicking “File” > “Save.”
To share your interactive poker game with others, click “File” > “Publish” > “Publish to Power BI.”
Sign in to your Power BI account and choose a destination workspace.
Once published, you can share the report link or embed it in a web page.
Congratulations! You’ve successfully created an interactive poker game in Power BI. By following these steps, you’ve learned how to import and manipulate data, create a visually appealing layout, and implement game functionality using bookmarks and actions. You can further customize your poker game by adding rules, logic, and additional visuals as needed. Happy playing!
This blogpost was created with help from ChatGPT Pro.
In today’s data-driven world, organizations need to harness the power of data analytics to make informed decisions, drive growth, and stay competitive. Microsoft offers two powerful tools for data analytics: Power BI and Azure Synapse. Both platforms have unique strengths and capabilities, making it essential to understand their differences and select the right tool for your data analytics needs. In this blog post, we will provide a comprehensive comparison of Power BI and Azure Synapse, discussing their features, use cases, and how they can work together to provide an end-to-end data analytics solution.
Power BI: An Overview
Power BI is a suite of business analytics tools that enables users to connect to various data sources, visualize and analyze data, and share insights through interactive reports and dashboards. It caters to both technical and non-technical users, providing a user-friendly interface and an extensive library of visualizations.
Key Features of Power BI:
Data Connectivity: Power BI supports a wide range of data sources, including relational databases, NoSQL databases, cloud-based services, and file-based sources.
Data Modeling: Users can create relationships, hierarchies, and measures using Power BI’s data modeling capabilities.
Data Visualization: Power BI offers numerous built-in visuals and the ability to create custom visuals using the open-source community or by developing them in-house.
DAX (Data Analysis Expressions): DAX is a powerful formula language used to create calculated columns and measures in Power BI.
Collaboration and Sharing: Power BI allows users to share reports and dashboards within their organization or embed them into applications.
Azure Synapse: An Overview
Azure Synapse Analytics is an integrated analytics service that brings together big data and data warehousing. It enables users to ingest, prepare, manage, and serve data for immediate business intelligence and machine learning needs. Azure Synapse provides a scalable and secure data warehouse, offering both serverless and provisioned resources for data processing.
Key Features of Azure Synapse:
Data Ingestion: Azure Synapse supports various data ingestion methods, including batch and real-time processing.
Data Transformation: Users can perform data cleaning, transformation, and enrichment using Azure Synapse’s data flow and data lake integration capabilities.
Data Storage: Azure Synapse provides a fully managed, secure, and scalable data warehouse that supports both relational and non-relational data.
Data Processing: Users can execute large-scale data processing tasks with serverless or provisioned SQL pools and Apache Spark pools.
Machine Learning: Azure Synapse integrates with Azure Machine Learning, allowing users to build, train, and deploy machine learning models using their data.
Choosing the Right Tool: Power BI vs. Azure Synapse
While Power BI and Azure Synapse have some overlapping features, they serve different purposes in the data analytics ecosystem. Here’s a quick comparison to help you choose the right tool for your needs:
Data Analysis and Visualization: Power BI is the ideal choice for data analysis and visualization, offering user-friendly tools for creating interactive reports and dashboards. Azure Synapse is primarily a data storage and processing platform, with limited visualization capabilities.
Data Processing and Transformation: Azure Synapse excels at large-scale data processing and transformation, making it suitable for big data and complex ETL tasks. Power BI has some data preparation capabilities but is best suited for smaller datasets and simple transformations.
Data Storage: Azure Synapse provides a scalable and secure data warehouse for storing large volumes of structured and unstructured data. Power BI is not designed for data storage; it connects to external data sources for analysis.
Machine Learning: Azure Synapse’s integration with Azure Machine Learning makes it the preferred choice for organizations looking to build, train, and deploy machine learning models. Power BI offers some basic machine learning capabilities through the integration of Azure ML and R/Python scripts but is not as comprehensive as Azure Synapse.
Scalability: Azure Synapse is designed to handle massive datasets and workloads, offering a scalable solution for data storage and processing. Power BI, on the other hand, is more suitable for small to medium-sized datasets and may face performance issues with large volumes of data.
User Skill Set: Power BI caters to both technical and non-technical users, offering a user-friendly interface for creating reports and dashboards. Azure Synapse is primarily geared towards data engineers, data scientists, and developers who require a more advanced platform for data processing and analytics.
Leveraging Power BI and Azure Synapse Together
Power BI and Azure Synapse can work together to provide an end-to-end data analytics solution. Azure Synapse can be used for data ingestion, transformation, storage, and processing, while Power BI can be used for data visualization and analysis. By integrating the two platforms, organizations can achieve a seamless data analytics workflow, from raw data to actionable insights.
Here’s how you can integrate Power BI and Azure Synapse:
Connect Power BI to Azure Synapse: Power BI can connect directly to Azure Synapse, allowing users to access and visualize data stored in the Synapse workspace.
Use Azure Synapse Data Flows for Data Preparation: Azure Synapse Data Flows can be used to clean, transform, and enrich data before visualizing it in Power BI.
Leverage Power BI Dataflows with Azure Synapse: Power BI Dataflows can be used in conjunction with Azure Synapse, storing the output of data preparation tasks in Azure Data Lake Storage Gen2 for further analysis.
Power BI and Azure Synapse are both powerful data analytics tools, but they cater to different needs and use cases. Power BI is best suited for data analysis, visualization, and sharing insights through interactive reports and dashboards, while Azure Synapse excels at large-scale data processing, storage, and machine learning.
To maximize the potential of your data analytics efforts, consider leveraging both tools in tandem. By integrating Power BI and Azure Synapse, you can create a comprehensive, end-to-end data analytics solution that covers all aspects of the analytics workflow, from raw data to actionable insights.
This blogpost was created with help from ChatGPT Pro.
It’s been over a decade since Dragon Age 2 (DA2) first graced our gaming screens, and it still remains one of the most divisive titles in BioWare’s storied catalog. While many fans were initially disappointed with the game, citing a streamlined narrative and repetitive environments, it’s time to revisit this underrated gem and give it the credit it truly deserves. Today, we’ll dive deep into the world of DA2, exploring its unique storytelling, unforgettable characters, and much-needed innovation that makes it worthy of your gaming time.
A Bold New Take on Storytelling
Dragon Age 2 made a daring choice to depart from the grand, world-saving storyline of its predecessor, Dragon Age: Origins. Instead, it opted for a more personal, focused narrative that spanned a decade in the life of protagonist Hawke. This narrative structure allowed the game to explore themes of family, friendship, and the impact of one’s choices on the world around them.
The game’s unique framing device, with the story being retold by Varric, gives it an intriguing and intimate feel. This approach adds depth and nuance to the storytelling, as players must consider how Varric’s perspective may have shaped the events recounted in the game.
Unforgettable Characters
DA2 introduces a cast of memorable, well-developed companions, each with their own personal stories, motivations, and character arcs. From the fiery mage Anders to the stoic warrior Fenris, the game’s roster of companions is incredibly diverse and engaging. Friendships and rivalries can form between these characters, and the choices you make in the game will have a lasting impact on their relationships and development.
The game also sees the return of a few fan favorites, including the lovable dwarf Varric and the enigmatic Isabela. Their stories are expanded upon in DA2, giving players an opportunity to learn more about their backgrounds and what drives them.
Challenging Choices with Consequences
One of the hallmarks of BioWare games is the ability to make difficult choices that have consequences, and Dragon Age 2 is no exception. The game’s central conflict between mages and templars is morally complex, and there are no easy answers or clear-cut paths to follow. This forces players to grapple with their decisions and consider the long-term implications of their actions.
A Streamlined Combat System
While some critics argue that the simplified combat system in DA2 detracts from the strategic depth of its predecessor, it’s important to recognize the benefits of the streamlined approach. The faster-paced, more action-oriented combat makes for a more engaging and dynamic experience, which keeps players on their toes and adds excitement to each encounter.
Art Style and Visuals
Dragon Age 2 boasts a unique and striking art style that sets it apart from other games in the genre. The use of bold colors and stylized character designs give the game a distinct aesthetic that is both beautiful and memorable. The game’s visual storytelling is further enhanced by cinematic camera angles and expressive character animations that truly bring the world of Thedas to life.
Conclusion
It’s high time we give Dragon Age 2 the appreciation it deserves. While it may not be a perfect game, it is undoubtedly an underrated gem that dared to innovate and take risks in a genre often rife with clichés. The game’s unique storytelling, engaging characters, and streamlined combat make it a must-play experience for fans of the Dragon Age series and RPGs alike. So, dust off that old copy of DA2 or pick one up on sale, and embark on a thrilling adventure through the streets of Kirkwall – it’s time to embrace the dragon once more.
In the years since its release, Dragon Age 2 has gained a dedicated and passionate fanbase that recognizes the game’s merits and strengths. As you explore the Free Marches, you’ll find that the game’s intimate and personal narrative resonates deeply, and the camaraderie between Hawke and their companions is a powerful force that drives the story forward.
Additionally, with Dragon Age 4 on the horizon, revisiting DA2 can provide valuable insight and context for the continuing story in the Dragon Age universe. Hawke’s journey and the choices you make in Dragon Age 2 may have a significant impact on the world of Thedas as we venture forth into the next chapter of the saga.
So, take a chance on this underrated gem, and you may just discover a new appreciation for a game that dared to be different. With its rich storytelling, captivating characters, and compelling choices, Dragon Age 2 is an experience that’s worth revisiting or exploring for the first time. As Varric would say, “Everyone has a story to tell – and sometimes the story is the best part.”
This blogpost was created with help from ChatGPT Pro.
Automated data storytelling is a powerful way to transform complex data visualizations into meaningful narratives. By leveraging OpenAI’s natural language generation capabilities, you can create engaging and informative stories based on your Power BI data visualizations. In this blog post, we’ll discuss the importance of automated data storytelling and guide you through the process of using OpenAI to generate narratives and summaries for your Power BI reports.
Note – Please be aware that this solution involves interacting with OpenAI’s API. I encourage users to familiarize themselves with OpenAI’s data usage policy (https://platform.openai.com/docs/data-usage-policy) and take necessary precautions to ensure the privacy and security of their data.
The Importance of Automated Data Storytelling
Data visualizations in Power BI enable users to analyze and gain insights from their data. However, interpreting these visualizations can be challenging, especially for users without a background in data analysis. Automated data storytelling bridges this gap by:
Making data insights accessible: Narratives help users understand the context and significance of the data, making insights more accessible to a broader audience.
Enhancing decision-making: Clear and concise narratives can help users grasp the implications of the data, leading to better-informed decisions.
Saving time and resources: Generating data stories automatically reduces the time and effort required to create manual reports and analyses.
Prerequisites and Setup
Before we begin, you’ll need the following:
Power BI data visualizations: Ensure that you have a Power BI report or dashboard with data visualizations that you’d like to generate narratives for.
OpenAI API key: Sign up for an OpenAI API key if you haven’t already. You’ll use this to access OpenAI’s natural language generation capabilities. Visit https://beta.openai.com/signup/ to sign up.
A development environment: You can use any programming language and environment that supports HTTP requests. For this tutorial, we’ll use Python and the requests library.
Accessing Power BI Data
In order to generate narratives based on your Power BI data visualizations, you’ll first need to extract the data from the visualizations. You can do this using the Power BI API. Follow the instructions in the “Accessing Power BI Data” section of our previous blog post on creating a Power BI chatbot to set up the necessary API access and create a Python function to query the Power BI API: https://christopherfinlan.com/?p=1921
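As a sketch of what that query helper can build on, the Power BI REST API exposes an executeQueries endpoint that runs a DAX query against a dataset. The dataset ID, access token, and the 'Sales' table in the example query are placeholders; acquiring the Azure AD token is covered in the earlier post:

```python
import json

# Hypothetical placeholders - supply your real dataset ID and Azure AD access token
DATASET_ID = "your-dataset-id"
ACCESS_TOKEN = "your-access-token"

def build_executequeries_request(dax_query):
    """Build the URL, headers, and body for the Power BI executeQueries REST call.
    Send it with, e.g., requests.post(url, headers=headers, json=body)."""
    url = f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/executeQueries"
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    }
    body = {"queries": [{"query": dax_query}]}
    return url, headers, body

# Example DAX query against a hypothetical 'Sales' table
url, headers, body = build_executequeries_request(
    "EVALUATE SUMMARIZECOLUMNS('Sales'[Month], \"Total\", SUM('Sales'[Amount]))"
)
print(url)
```

The JSON response contains the query's result rows, which you can then reshape into the prompt text sent to OpenAI.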
Generating Narratives with OpenAI
Once you have access to your Power BI data, you can use OpenAI’s API to generate narratives based on the data. Create a Python function to send data to the OpenAI API, as demonstrated in the “Building the Chatbot with OpenAI” section of our previous blog post: https://christopherfinlan.com/?p=1921
Crafting Data Stories
To create data stories, you’ll need to design prompts for the OpenAI API that effectively convey the context and purpose of the data visualizations. The prompts should include relevant data points, visualization types, and any specific insights you’d like the narrative to highlight. Here’s an example of a prompt for a sales report:
openai_prompt = f"""
Create a narrative based on the following sales data visualization:
- Data: {sales_data}
- Visualization type: Bar chart
- Time period: Last 12 months
- Key insights: Top 3 products, monthly growth rate, and seasonal trends
"""
narrative = chat_with_openai(openai_prompt)
Because the prompt is an f-string, make sure the sales_data variable holds the actual data you’ve extracted from your Power BI visualization before building the prompt.
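If you don’t have the chat_with_openai helper from the earlier post handy, a minimal standard-library version might look like the following. Treat the endpoint and model name as assumptions to check against OpenAI’s current documentation:

```python
import json
import os
import urllib.request

def chat_with_openai(prompt):
    """Send a prompt to the OpenAI completions API and return the generated text.
    Assumes the API key is stored in the OPENAI_API_KEY environment variable."""
    payload = json.dumps({
        "model": "text-davinci-003",  # assumed model name; swap in a current one
        "prompt": prompt,
        "max_tokens": 300,
        "temperature": 0.7,
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://api.openai.com/v1/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(request) as response:
        body = json.load(response)
    return body["choices"][0]["text"].strip()
```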
Integrating Narratives into Power BI Reports
With the generated narratives, you can enhance your Power BI reports by embedding the narratives as text boxes or tooltips. Although Power BI doesn’t currently support direct integration with OpenAI, you can use the following workaround:
Manually copy the generated narrative and paste it into a text box or tooltip within your Power BI report.
For a more automated approach, you can build a custom web application that combines both Power BI data visualizations and generated narratives. To achieve this, follow these steps:
Embed Power BI visuals using Power BI Embedded: Power BI Embedded allows you to integrate Power BI visuals into custom web applications. Follow the official documentation to learn how to embed Power BI reports and dashboards in your web application: https://docs.microsoft.com/en-us/power-bi/developer/embedded/embedding
Create a web application with a user interface: Design a user interface for your web application that displays Power BI visuals alongside the generated narratives. You can use HTML, CSS, and JavaScript to create the user interface.
Fetch narratives using JavaScript: When a user interacts with your Power BI visuals or requests a narrative, use JavaScript to send a request to your OpenAI-powered Python backend. The backend should return the generated narrative, which can then be displayed in your web application.
Here’s a simple example using JavaScript to fetch a narrative from your Python backend:
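A sketch of such a request, assuming the Python backend exposes a /narrative endpoint that accepts the visualization data as JSON (the endpoint name and the fetchNarrative helper are illustrative):

```javascript
// Hypothetical /narrative endpoint on the Python backend; adjust the URL
// and payload shape to match your own API.
async function fetchNarrative(data, fetchFn = fetch) {
  const response = await fetchFn("/narrative", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ data: data }),
  });
  const result = await response.json();
  return result.narrative;
}

// Example: display the narrative next to the embedded report
// fetchNarrative(yourData).then((text) => {
//   document.getElementById("narrative").textContent = text;
// });
```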
Remember to replace yourData with the data you’ve extracted from your Power BI visualization.
Conclusion
Automated data storytelling enhances the value of Power BI data visualizations by providing users with engaging narratives that help them better understand their data. By leveraging OpenAI’s natural language generation capabilities, you can automatically generate insightful narratives and summaries based on your Power BI visuals. Although direct integration with Power BI is not currently available, you can still utilize OpenAI-generated narratives in your reports or create custom web applications to combine Power BI visuals with automated storytelling.
This blog post was created with help from ChatGPT Pro.
Azure Synapse Analytics is an integrated analytics service that brings together big data and data warehousing. It offers an effective way to ingest, process, and analyze massive amounts of structured and unstructured data. One of the core components of Azure Synapse Analytics is the Spark engine, which enables distributed data processing at scale. In this blog post, we will delve into the best practices for managing and monitoring Spark workloads in Azure Synapse Analytics.
Properly configure Spark clusters:
Azure Synapse Analytics offers managed Spark clusters that can be configured based on workload requirements. To optimize performance, ensure you:
Choose the right VM size for your Spark cluster, considering factors like CPU, memory, and storage.
Configure the number of nodes in the cluster based on the scale of your workload.
Use auto-pause and auto-scale features to optimize resource usage and reduce costs.
Optimize data partitioning:
Data partitioning is crucial for efficiently distributing data across Spark tasks. To optimize partitioning:
Choose an appropriate partitioning key, based on data distribution and query patterns.
Avoid data skew by ensuring that partitions are evenly sized.
Use adaptive query execution to enable dynamic partitioning adjustments during query execution.
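In a Synapse notebook, where a ‘spark’ session is predefined, these settings can be enabled per session. A minimal sketch; the values and the ‘customer_id’ column are illustrative, not recommendations for every workload:

```python
# Enable adaptive query execution so Spark can coalesce small shuffle
# partitions and mitigate skew at runtime (Spark 3.x)
spark.conf.set("spark.sql.adaptive.enabled", "true")
spark.conf.set("spark.sql.adaptive.coalescePartitions.enabled", "true")
spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")

# Repartition on the key most queries filter or join on
# ("customer_id" is a placeholder column name)
df = df.repartition("customer_id")
```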
Leverage caching:
Caching is an effective strategy for optimizing iterative or repeated Spark workloads. To leverage caching:
Cache intermediate datasets to avoid recomputing expensive transformations.
Use the ‘unpersist()’ method to free memory when cached data is no longer needed.
Monitor cache usage and adjust the storage level as needed.
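As a brief sketch of the cache-then-release pattern in a Synapse notebook (a ‘spark’ session and an existing DataFrame ‘df’ are assumed; column names are placeholders):

```python
from pyspark import StorageLevel

# Cache the intermediate result before reusing it across several actions
enriched = df.filter("amount > 0").persist(StorageLevel.MEMORY_AND_DISK)

total = enriched.count()          # first action materializes the cache
summary = enriched.groupBy("region").sum("amount").collect()

# Release the memory once the cached data is no longer needed
enriched.unpersist()
```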
Monitor Spark workloads:
Azure Synapse Analytics provides various monitoring tools to track Spark workload performance:
Use Synapse Studio for real-time monitoring and visualization of Spark job execution.
Leverage Azure Monitor for gathering metrics and setting up alerts.
Analyze Spark application logs for insights into potential performance bottlenecks.
Optimize Spark SQL:
To optimize Spark SQL performance:
Use the ‘EXPLAIN’ command to understand query execution plans and identify potential optimizations.
Leverage Spark’s built-in cost-based optimizer (CBO) to improve query execution.
Use data partitioning and bucketing techniques to reduce data shuffling.
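For example, from a notebook cell (the table and columns are placeholders):

```python
# Inspect the physical plan before running an expensive query
spark.sql("""
    EXPLAIN FORMATTED
    SELECT region, SUM(amount)
    FROM sales
    GROUP BY region
""").show(truncate=False)
```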
Use Delta Lake for reliable data storage:
Delta Lake is an open-source storage layer that brings ACID transactions and scalable metadata handling to Spark. Using Delta Lake can help:
Improve data reliability and consistency with transactional operations.
Enhance query performance by leveraging Delta Lake’s optimized file layout and indexing capabilities.
Simplify data management with features like schema evolution and time-travel queries.
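A short sketch of the write-then-time-travel pattern (the storage path and the ‘df’ and ‘updates’ DataFrames are placeholders):

```python
path = "abfss://data@yourlake.dfs.core.windows.net/sales_delta"  # placeholder

# Write the initial version of the table
df.write.format("delta").mode("overwrite").save(path)

# Later writes create new versions...
updates.write.format("delta").mode("append").save(path)

# ...which time travel can read back by version number
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)
```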
Optimize data ingestion:
To optimize data ingestion in Azure Synapse Analytics:
Use Azure Data Factory or Azure Logic Apps for orchestrating and automating data ingestion pipelines.
Leverage PolyBase for efficient data loading from external sources into Synapse Analytics.
Use the COPY statement to efficiently ingest large volumes of data.
Conclusion:
Managing and monitoring Spark workloads in Azure Synapse Analytics is essential for ensuring optimal performance and resource utilization. By following the best practices outlined in this blog post, you can optimize your Spark applications and extract valuable insights from your data.
This blog post was created with help from ChatGPT Pro.
The integration of the OpenAI API with Microsoft Excel is revolutionizing the way businesses interact with their data. One of the most exciting developments is the creation of Excel-based chatbots, which have the potential to automate various functions such as:
Customer Service: Automate customer support by creating a chatbot that can handle frequently asked questions, troubleshoot issues, and provide guidance on product usage.
Sales: Develop a sales chatbot that can assist customers in finding the right product, offer personalized recommendations, and even process orders directly from Excel.
HR and Recruitment: Design a chatbot to automate the screening process, answer candidate queries, and schedule interviews.
Inventory Management: Build a chatbot to help users track inventory levels, place orders for restocking, and provide real-time updates on product availability.
Knowledge Management: Create a chatbot that can quickly retrieve information from internal databases, streamlining the process of accessing company data.
In this blog post, we will provide a detailed, step-by-step walkthrough for creating a chatbot within Excel using the OpenAI API. This guide is designed for novice Excel users, with easy-to-follow instructions and ready-to-use code snippets.
Note – Please be aware that this solution involves interacting with OpenAI’s API. I encourage users to familiarize themselves with OpenAI’s data usage policy (https://platform.openai.com/docs/data-usage-policy) and take necessary precautions to ensure the privacy and security of their data.
Step 1: Setting up the OpenAI API
First, sign up for an API key on the OpenAI website (https://beta.openai.com/signup/) and follow the instructions to get started.
Step 2: Creating a Chatbot Interface in Excel
Open a new Excel workbook.
In cell A1, type “User Input”.
In cell B1, type “Chatbot Response”.
Format the cells as desired for better readability.
Step 3: Enable Excel Developer Tab
Click on “File” > “Options” > “Customize Ribbon”.
Check the box next to “Developer” in the right column and click “OK”.
Step 4: Add a Button to Trigger Chatbot
Click the “Developer” tab.
Click “Insert” and select “Button (Form Control)”.
Draw the button below the “User Input” and “Chatbot Response” cells.
Right-click the button, select “Edit Text,” and type “Submit”.
Step 5: Write VBA Code for OpenAI API Integration
Right-click the “Submit” button and click “Assign Macro”.
In the “Assign Macro” window, click “New”.
Copy and paste the following code into the VBA editor:
Option Explicit
Private Sub Submit_Click()
Dim userInput As String
Dim chatbotResponse As String
' Get user input from cell A2
userInput = Range("A2").Value
' Call OpenAI API to get chatbot response
chatbotResponse = GetOpenAIResponse(userInput)
' Display chatbot response in cell B2
Range("B2").Value = chatbotResponse
End Sub
Function GetOpenAIResponse(userInput As String) As String
Dim objHTTP As Object
Dim apiKey As String
Dim apiUrl As String
Dim jsonBody As String
Dim jsonResponse As String
Dim json As Object
' Set your OpenAI API key here
apiKey = "your_openai_api_key"
' Set OpenAI API endpoint URL
apiUrl = "https://api.openai.com/v1/chat/completions"
' Create JSON request body
jsonBody = "{""model"": ""gpt-3.5-turbo"", ""messages"": [{""role"": ""system"", ""content"": ""You are a helpful assistant.""}, {""role"": ""user"", ""content"": """ & userInput & """}], ""max_tokens"": 50, ""temperature"": 0.7}"
' Create an HTTP object
Set objHTTP = CreateObject("MSXML2.ServerXMLHTTP.6.0")
' Send the request to OpenAI API
With objHTTP
.Open "POST", apiUrl, False
.setRequestHeader "Content-Type", "application/json"
.setRequestHeader "Authorization", "Bearer " & apiKey
.send jsonBody
jsonResponse = .responseText
End With
' Extract the reply from the JSON response. VBA has no built-in JSON
' parser, so use string searching (assumes no escaped quotes in the reply)
Dim keyPos As Long, startPos As Long, endPos As Long
keyPos = InStr(jsonResponse, """content""")
If keyPos > 0 Then
    startPos = InStr(keyPos + 9, jsonResponse, """") + 1
    endPos = InStr(startPos, jsonResponse, """")
    GetOpenAIResponse = Mid(jsonResponse, startPos, endPos - startPos)
Else
    GetOpenAIResponse = "Error: unexpected API response - " & jsonResponse
End If
End Function
Replace your_openai_api_key with your actual OpenAI API key.
Save the VBA code by clicking the floppy disk icon or pressing Ctrl+S.
Close the VBA editor by clicking the “X” in the top-right corner.
(Optional) Step 6: Add a Microsoft XML Reference
The code above creates its objects with late binding (CreateObject), so it runs without adding any library references. If you prefer early binding for IntelliSense support:
Press Alt+F11 to open the VBA editor.
Click “Tools” > “References” in the menu.
Scroll down and check the box next to “Microsoft XML, v6.0” (or the latest version available).
Click “OK” to close the “References” window.
Step 7: Test Your Excel Chatbot
In cell A2, type a question or statement for your chatbot (e.g., “What is the capital of France?”).
Click the “Submit” button.
The chatbot’s response should appear in cell B2.
(Optional) Step 8: Add Context Awareness to Your Excel Chatbot
In your Excel workbook, create a new sheet and rename it to “ConversationHistory”.
In the VBA editor, modify the Submit_Click() subroutine by adding the following lines of code before the line chatbotResponse = GetOpenAIResponse(userInput):
' Save user input to ConversationHistory sheet
With Sheets("ConversationHistory")
    .Cells(.Cells(.Rows.Count, 1).End(xlUp).Row + 1, 1).Value = "User: " & userInput
End With
3. Modify the GetOpenAIResponse() function by adding a new argument called conversationHistory:
Function GetOpenAIResponse(userInput As String, conversationHistory As String) As String
4. Update the Submit_Click() subroutine to pass the conversation history when calling the GetOpenAIResponse() function. Replace the line chatbotResponse = GetOpenAIResponse(userInput) with the following code:
' Get conversation history
Dim conversationHistory As String
conversationHistory = Join(Application.Transpose(Sheets("ConversationHistory").UsedRange.Value), " ")

' Call OpenAI API to get chatbot response
chatbotResponse = GetOpenAIResponse(userInput, conversationHistory)
5. Modify the JSON request body in the GetOpenAIResponse() function to include the conversation history. Replace the existing jsonBody assignment with the following code:
' Create JSON request body with conversation history
jsonBody = "{""model"": ""gpt-3.5-turbo"", ""messages"": [{""role"": ""system"", ""content"": ""You are a helpful assistant. Conversation so far: " & conversationHistory & """}, {""role"": ""user"", ""content"": """ & userInput & """}], ""max_tokens"": 50, ""temperature"": 0.7}"
6. Finally, update the Submit_Click() subroutine to save the chatbot’s response in the “ConversationHistory” sheet. Add the following lines of code after the line Range("B2").Value = chatbotResponse:
' Save chatbot response to ConversationHistory sheet
With Sheets("ConversationHistory")
    .Cells(.Cells(.Rows.Count, 1).End(xlUp).Row + 1, 1).Value = "Chatbot: " & chatbotResponse
End With
With this optional modification in place, your Excel chatbot will now maintain a conversation history in the “ConversationHistory” sheet, which will be used to provide context-aware responses. The chatbot will be able to give more accurate and relevant answers based on the user’s previous interactions.
Conclusion
You now have a working Excel-based chatbot powered by the OpenAI API. This guide has provided you with a simple, step-by-step approach to creating a chatbot interface within Excel and integrating it with the OpenAI API using VBA code. While this example uses basic functionality, you can expand and adapt the code to create chatbots for various applications, such as customer service, sales, HR, inventory management, and knowledge management. With the power of the OpenAI API and the familiar Excel interface, businesses can create powerful, user-friendly tools to enhance operations and boost productivity.
This blog post was created with help from ChatGPT Pro.
In a previous blog post, we explored why Lee Meriwether was the best Catwoman from the 1966 Batman TV Show. However, it’s essential to provide a more in-depth analysis by comparing and contrasting specific scenes and elements that showcase her chemistry and timeless appeal. In this follow-up post, we’ll dive into some iconic scenes that demonstrate why Lee Meriwether stood out among other actresses who portrayed Catwoman.
Scene 1: Meeting Batman and Robin at the Museum (Batman: The Movie, 1966)
One of the most memorable scenes featuring Lee Meriwether’s Catwoman is when she, disguised as Miss Kitka, meets Batman and Robin at the museum. This scene highlights her versatility as an actress, as she convincingly switches between her Russian journalist persona and her cunning feline alter-ego.
Compared to Julie Newmar and Eartha Kitt, Meriwether’s performance in this scene is particularly noteworthy because of her seamless transition between personas. Her ability to portray a believable and charming Miss Kitka showcases her range as an actress and her undeniable chemistry with Adam West’s Batman.
Scene 2: The Seduction of Batman
Lee Meriwether’s Catwoman excelled in her ability to seduce Batman subtly. In Batman: The Movie, there’s a scene where Catwoman, as Miss Kitka, invites Batman to her apartment for a “private interview.” Meriwether’s performance strikes a delicate balance between flirtatious and innocent, allowing her chemistry with Batman to shine.
In contrast, Julie Newmar’s Catwoman was more overtly flirtatious, while Eartha Kitt’s rendition was more aggressive and domineering. Meriwether’s subtlety in this scene demonstrates her unique and timeless appeal, setting her apart from the other portrayals of Catwoman.
Scene 3: The Climactic Battle on the Submarine
In the climactic battle on the submarine in Batman: The Movie, Lee Meriwether’s Catwoman shows her cunning and combat skills. She fights alongside the other villains against Batman and Robin, displaying both her intelligence and physical prowess.
Comparing this scene with other Catwoman portrayals, Meriwether stands out because of her ability to balance the character’s feminine charm and cunning nature. Newmar and Kitt, while both skilled in combat, leaned more towards either seductiveness (Newmar) or fierceness (Kitt). Meriwether’s performance captures the essence of Catwoman in a way that feels both authentic and timeless.
Scene 4: The Final Unmasking
The final unmasking scene, where Batman discovers Catwoman’s true identity, is crucial in showcasing Meriwether’s acting prowess. In this scene, she masterfully switches between her personas, revealing her vulnerability and allowing the audience to sympathize with her character.
Comparatively, Newmar’s and Kitt’s unmasking scenes lacked the same emotional depth. Meriwether’s ability to evoke empathy and portray a multi-dimensional character solidifies her status as the most timeless and engaging Catwoman.
Conclusion
Lee Meriwether’s portrayal of Catwoman in the 1966 Batman TV Show and movie stands out for several reasons. Her ability to transition between personas, subtle seduction, balanced combat skills, and emotional depth in key scenes set her apart from Julie Newmar and Eartha Kitt. These specific scenes and elements demonstrate why Lee Meriwether’s Catwoman has a more timeless appeal and better chemistry with Adam West’s Batman.
This blog post was created with help from ChatGPT Pro.
When it comes to the iconic 1966 Batman TV series, there’s a lot to remember and love. From the campy humor to the unforgettable theme song, the show remains an indelible part of our pop culture history. One of the most memorable aspects of the show was the rogues’ gallery of colorful villains. Among them, the seductive and cunning Catwoman stands out, portrayed by three different actresses during the series’ run. Julie Newmar, Lee Meriwether, and Eartha Kitt all brought their unique spin to the character, but it was Lee Meriwether who arguably made the most lasting impression, despite only portraying Catwoman in the 1966 Batman Movie. In this blog post, we’ll make a case for why Lee Meriwether was the best Catwoman in the 1966 Batman TV series.
Lee Meriwether’s Catwoman: A Seamless Blend of Danger and Allure
While Julie Newmar and Eartha Kitt were undeniably talented and captivating in their portrayals of Catwoman, Lee Meriwether brought a unique combination of danger and allure to the role. Her performance in the 1966 Batman Movie showcased a Catwoman who was equal parts seductive and cunning. She possessed the ability to outsmart Batman and Robin at every turn, while also luring them into her web of deception. This made her not just an entertaining villain, but a formidable adversary for the Caped Crusader.
A Rich and Layered Performance
Lee Meriwether’s portrayal of Catwoman in the 1966 Batman Movie was not a one-dimensional caricature. She brought depth and nuance to the role, providing a more complex and intriguing character. In the movie, she played a dual role as both Catwoman and Russian journalist Kitka, seducing Bruce Wayne and attempting to manipulate him for her own gains. This added layer allowed Meriwether to showcase her acting range and gave the audience a glimpse into the mind of a cunning and intelligent villain.
A Fresh Take on an Iconic Character
When Lee Meriwether took on the role of Catwoman for the 1966 Batman Movie, she had big shoes to fill, as Julie Newmar had already made her mark as the character in the TV series. However, Meriwether rose to the challenge and managed to bring something fresh and exciting to the role. Her interpretation of Catwoman was not a mere imitation of her predecessor, but rather a distinct and captivating portrayal that resonated with fans of the show and movie alike.
A Timeless Appeal
Lee Meriwether’s Catwoman continues to captivate fans, even decades after the 1966 Batman Movie was released. Her performance remains a touchstone for fans of the character and the series, proving that she made a lasting impact with her portrayal. This is a testament to the strength of her performance and her ability to bring the character to life in a way that resonates with audiences across generations.
Chemistry with the Cast
One of the key aspects of any great performance is the chemistry between the actors. In the 1966 Batman Movie, Lee Meriwether displayed an undeniable chemistry with Adam West (Batman) and Burt Ward (Robin), as well as with the other iconic villains she shared the screen with. This on-screen dynamic elevated the movie and made it even more enjoyable for fans. The chemistry between Meriwether and West was particularly notable, as it lent credibility to their characters’ interactions and allowed the audience to become even more invested in their story.
Conclusion:
While Julie Newmar and Eartha Kitt both made significant contributions to the legacy of Catwoman in the 1966 Batman TV series, it is Lee Meriwether who truly stands out as the best Catwoman. Her seamless blend of danger and allure, her rich and layered performance, her fresh take on an iconic character, her timeless appeal, and her undeniable chemistry with the cast all combine to make her portrayal one for the ages.
It’s important to note that each actress brought something unique to the role of Catwoman, and their individual contributions should not be discounted. However, Lee Meriwether’s performance in the 1966 Batman Movie was so powerful and captivating that it transcends the fact that she only played the character once. Her interpretation of Catwoman stands as a testament to her talent and her ability to make a lasting impact on audiences.
This blog post was created with help from ChatGPT Pro.
Want to listen to this post instead of reading it? Listen to Virtual Christopher Finlan read this post in its entirety!
Introduction
As artificial intelligence (AI) continues to advance and integrate itself into our everyday lives, it’s essential to examine the ethical implications that come with using such technology. One of these AI applications, ChatGPT-4, has become increasingly popular for its ability to write coherent and well-structured content on a variety of subjects. This raises several questions regarding the ethics of using ChatGPT-4 to write blog posts for individuals or businesses. In this post, we’ll explore some of the ethical concerns surrounding the use of ChatGPT-4 and discuss potential ways to address these issues.
Authenticity and Honesty
One of the primary ethical concerns when using ChatGPT-4 to write blog posts is the question of authenticity. While AI-generated content can be informative and well-written, it lacks the personal touch that comes from a human author. As readers, we appreciate the unique perspectives and experiences that individuals bring to their writing. By using AI-generated content, we risk losing this sense of authenticity.
To address this concern, it is crucial for users of ChatGPT-4 to be transparent about the authorship of their content. Clearly stating that a blog post is AI-generated maintains honesty with readers and allows them to make informed decisions about the content they consume.
Intellectual Property and Credit
Another ethical concern surrounding the use of ChatGPT-4 is intellectual property and giving credit where it is due. AI-generated content is created by an algorithm and doesn’t have a human author to attribute credit to. This can create confusion regarding who should be credited for the work and may inadvertently lead to plagiarism or misattribution.
It is essential to give proper credit to the AI tool used and acknowledge its role in the content creation process. This not only promotes transparency but also ensures that intellectual property is respected.
Bias and Discrimination
AI algorithms like ChatGPT-4 learn from vast amounts of data, including text that may contain biases or discriminatory language. Consequently, the content generated by ChatGPT-4 might unintentionally perpetuate these biases or discriminatory ideas, leading to ethical concerns.
To address this issue, developers of AI algorithms should work to reduce bias and discrimination in their models. Users of ChatGPT-4 should also review the generated content carefully, ensuring that it does not perpetuate harmful stereotypes or discrimination.
Job Displacement
The use of AI-generated content also raises concerns about job displacement. As ChatGPT-4 becomes more capable of producing quality content, it may lead to a reduction in demand for human writers, resulting in job loss for some individuals.
While AI-generated content can be a valuable tool to assist writers, it should not be seen as a replacement for human creativity and expertise. Maintaining a balance between AI-generated and human-written content can help address job displacement concerns.
Conclusion
In conclusion, the use of ChatGPT-4 to write blog posts presents various ethical challenges that need to be considered. To ensure that AI-generated content is utilized responsibly, it is crucial to be transparent about authorship, give proper credit, actively work to reduce bias, and find a balance between AI-generated and human-written content. By considering these ethical concerns, we can make informed decisions on how to use AI tools like ChatGPT-4 while maintaining our values and commitment to responsible technology use.
This blog post was created with help from ChatGPT Pro.