Showing posts with label formatting.

26 February 2025

💎🤖SQL Reloaded: Copilot Stories (Part III: Formatting Code via Prompts)

When debugging poorly formatted code, it usually helps to format the code first before doing any other work on it. I observed in the previous post that Copilot can be used to restructure code based on a set of commands that summarize best practices. 

Example 1: Simple Query

Let's start with a simple example from the web

Prompt:
Format the below code by using the available SQL best practices
1) each join condition is written on its own line
2) the commas are placed at the beginning of each line
3) use 3 letter based aliases for all tables
4) compact the code by reducing the indentation

SELECT po.ExpectedDeliveryDate, po.OrderDate, pol.LastReceiptDate, pol.LastEditedWhen, p.ValidFrom, p.ValidTo FROM Purchasing.PurchaseOrders AS po JOIN Purchasing.PurchaseOrderLines AS pol ON pol.PurchaseOrderID = po.PurchaseOrderID JOIN Application.People AS p ON p.PersonID = po.ContactPersonID WHERE LastReceiptDate > '1/1/2016';

Output:

SELECT po.ExpectedDeliveryDate
    , po.OrderDate
    , pol.LastReceiptDate
    , pol.LastEditedWhen
    , p.ValidFrom
    , p.ValidTo
FROM Purchasing.PurchaseOrders AS po
JOIN Purchasing.PurchaseOrderLines AS pol
    ON pol.PurchaseOrderID = po.PurchaseOrderID
JOIN Application.People AS p
    ON p.PersonID = po.ContactPersonID
WHERE LastReceiptDate > '1/1/2016';

Of course, the colors were added via hilite.me formatting to better identify the reserved words. The output of the above prompt is not exactly what was intended in 3), though repeating a slightly changed command seems to obtain the expected result. Even if a 3-letter alias was not used for all tables, the query looks better:

Prompt:
Use 3 letter based aliases for all tables.

Output:

SELECT po.ExpectedDeliveryDate
    , po.OrderDate
    , pol.LastReceiptDate
    , pol.LastEditedWhen
    , ppl.ValidFrom
    , ppl.ValidTo
FROM Purchasing.PurchaseOrders AS po
JOIN Purchasing.PurchaseOrderLines AS pol
    ON pol.PurchaseOrderID = po.PurchaseOrderID
JOIN Application.People AS ppl
    ON ppl.PersonID = po.ContactPersonID
WHERE LastReceiptDate > '1/1/2016';

The script looks good, though it would be great if we could align the constraints with the order of the tables.

Prompt:
Rewrite the query by using the first table from the join in the join constraint.

Output:

SELECT po.ExpectedDeliveryDate
    , po.OrderDate
    , pol.LastReceiptDate
    , pol.LastEditedWhen
    , ppl.ValidFrom
    , ppl.ValidTo
FROM Purchasing.PurchaseOrders AS po
JOIN Purchasing.PurchaseOrderLines AS pol
    ON po.PurchaseOrderID = pol.PurchaseOrderID
JOIN Application.People AS ppl
    ON po.ContactPersonID = ppl.PersonID
WHERE LastReceiptDate > '1/1/2016';

As observed in the last post, multiple scripts can be transformed within the same prompt, especially when the statements were generated by Copilot as well. 

Example 2: Complex Dynamics AX Query

How about a more complex query? 
Let's consider the below query written for Dynamics AX:

Prompt:
Format the below code by using the available SQL best practices
1) each join condition is written on its own line
2) the commas are placed at the beginning of each line
3) use 3 letter based aliases for all tables
4) compact the code by reducing the indentation

SELECT ADR.DATAAREAID, DPT.FIRSTNAME, DPT.LASTNAME, DPT.LANGUAGEID, DPR.PartyId, DPR.IsPrimary, CASE DPR.Status WHEN 0 THEN 'Active' WHEN 1 THEN 'Inactive' END Status, NullIf(Cast(DPR.ValidFromDateTime as Date), '31-12-1900') ValidFromDateTime, NullIf(Cast(DPR.ValidToDateTime as Date), '30-12-2154') ValidToDateTime, DPR.Shared  , ADR.ADDRTABLEID , ADR.ADDRRECID , ADR.TYPE , CASE ADR.TYPE WHEN 0 THEN 'None' WHEN 1 THEN 'Invoice' WHEN 2 THEN 'Delivery'WHEN 3 THEN 'Alt. delivery'WHEN 4 THEN 'SWIFT'WHEN 5 THEN 'Payment' WHEN 6 THEN 'Service' WHEN 7 THEN 'Home' WHEN 8 THEN 'Other' WHEN 9 THEN 'Business' WHEN 10 THEN 'RemitTo' WHEN 11 THEN 'Third-party shipping address' END AddressDescription, ADR.NAME , ADR.ADDRESS , ADR.STREET, ADR.ZIPCODE , ADR.CITY , ADR.STATE , ADR.COUNTY , ADR.COUNTRYREGIONID , ADR.PHONE , ADR.TELEFAX , ADR.URL FROM ADDRESS ADR JOIN DIRPARTYADDRESSRELATIONSHI1066 DAC ON DAC.REFCOMPANYID = ADR.DATAAREAID and DAC.ADDRESSRECID = ADR.RECID and DAC.DATAAREAID = ADR.DATAAREAID JOIN DIRPARTYADDRESSRELATIONSHIP DPR on DPR.RECID = DAC.PARTYADDRESSRELATIONSHIPRECID and DPR.DATAAREAID = DAC.DATAAREAID JOIN DIRPARTYTABLE DPT on DPT.PARTYID = DPR.PARTYID and DPT.DATAAREAID = DPR.DATAAREAID  JOIN EmplTable EMP on DPT.PARTYID = EMP.PARTYID and DPT.DATAAREAID = EMP.DATAAREAID  

Output:

SELECT 
    ADR.DATAAREAID
    , DPT.FIRSTNAME
    , DPT.LASTNAME
    , DPT.LANGUAGEID
    , DPR.PartyId
    , DPR.IsPrimary
    , CASE DPR.Status 
        WHEN 0 THEN 'Active' 
        WHEN 1 THEN 'Inactive' 
      END AS Status
    , NullIf(Cast(DPR.ValidFromDateTime AS DATE), '31-12-1900') AS ValidFromDateTime
    , NullIf(Cast(DPR.ValidToDateTime AS DATE), '30-12-2154') AS ValidToDateTime
    , DPR.Shared
    , ADR.ADDRTABLEID
    , ADR.ADDRRECID
    , ADR.TYPE
    , CASE ADR.TYPE 
        WHEN 0 THEN 'None' 
        WHEN 1 THEN 'Invoice' 
        WHEN 2 THEN 'Delivery'
        WHEN 3 THEN 'Alt. delivery'
        WHEN 4 THEN 'SWIFT'
        WHEN 5 THEN 'Payment' 
        WHEN 6 THEN 'Service' 
        WHEN 7 THEN 'Home' 
        WHEN 8 THEN 'Other' 
        WHEN 9 THEN 'Business' 
        WHEN 10 THEN 'RemitTo' 
        WHEN 11 THEN 'Third-party shipping address' 
      END AS AddressDescription
    , ADR.NAME
    , ADR.ADDRESS
    , ADR.STREET
    , ADR.ZIPCODE
    , ADR.CITY
    , ADR.STATE
    , ADR.COUNTY
    , ADR.COUNTRYREGIONID
    , ADR.PHONE
    , ADR.TELEFAX
    , ADR.URL
FROM 
    ADDRESS ADR
JOIN 
    DIRPARTYADDRESSRELATIONSHI1066 DAC 
    ON DAC.REFCOMPANYID = ADR.DATAAREAID 
    AND DAC.ADDRESSRECID = ADR.RECID 
    AND DAC.DATAAREAID = ADR.DATAAREAID
JOIN 
    DIRPARTYADDRESSRELATIONSHIP DPR 
    ON DPR.RECID = DAC.PARTYADDRESSRELATIONSHIPRECID 
    AND DPR.DATAAREAID = DAC.DATAAREAID
JOIN 
    DIRPARTYTABLE DPT 
    ON DPT.PARTYID = DPR.PARTYID 
    AND DPT.DATAAREAID = DPR.DATAAREAID
JOIN 
    EmplTable EMP 
    ON DPT.PARTYID = EMP.PARTYID 
    AND DPT.DATAAREAID = EMP.DATAAREAID;

The query looks good, though it needs more transformations:

Input (to be run line by line):

Rewrite the query by writing names in camel with the beginning letter in upper case.
Rewrite the query by writing table aliases in upper case.

Output:

SELECT 
      ADR.DataAreaId
    , DPT.FirstName
    , DPT.LastName
    , DPT.LanguageId
    , DPR.PartyId
    , DPR.IsPrimary
    , CASE DPR.Status 
        WHEN 0 THEN 'Active' 
        WHEN 1 THEN 'Inactive' 
      END AS Status
    , NullIf(Cast(DPR.ValidFromDateTime AS DATE), '31-12-1900') AS ValidFromDateTime
    , NullIf(Cast(DPR.ValidToDateTime AS DATE), '30-12-2154') AS ValidToDateTime
    , DPR.Shared
    , ADR.AddrTableId
    , ADR.AddrRecId
    , ADR.Type
    , CASE ADR.Type 
        WHEN 0 THEN 'None' 
        WHEN 1 THEN 'Invoice' 
        WHEN 2 THEN 'Delivery' 
        WHEN 3 THEN 'Alt. Delivery' 
        WHEN 4 THEN 'Swift' 
        WHEN 5 THEN 'Payment' 
        WHEN 6 THEN 'Service' 
        WHEN 7 THEN 'Home' 
        WHEN 8 THEN 'Other' 
        WHEN 9 THEN 'Business' 
        WHEN 10 THEN 'RemitTo' 
        WHEN 11 THEN 'Third-Party Shipping Address' 
      END AS AddressDescription
    , ADR.Name
    , ADR.Address
    , ADR.Street
    , ADR.ZipCode
    , ADR.City
    , ADR.State
    , ADR.County
    , ADR.CountryRegionId
    , ADR.Phone
    , ADR.Telefax
    , ADR.Url
FROM 
    dbo.Address ADR
JOIN 
    dbo.DirPartyAddressRelationship1066 DAC 
    ON DAC.RefCompanyId = ADR.DataAreaId 
    AND DAC.AddressRecId = ADR.RecId 
    AND DAC.DataAreaId = ADR.DataAreaId
JOIN 
    dbo.DirPartyAddressRelationship DPR 
    ON DPR.RecId = DAC.PartyAddressRelationshipRecId 
    AND DPR.DataAreaId = DAC.DataAreaId
JOIN 
    dbo.DirPartyTable DPT 
    ON DPT.PartyId = DPR.PartyId 
    AND DPT.DataAreaId = DPR.DataAreaId
JOIN 
    dbo.EmplTable EMP 
    ON DPT.PartyId = EMP.PartyId 
    AND DPT.DataAreaId = EMP.DataAreaId;

Example 3: Simple X++ Query

How about code written in other languages (e.g. X++)?

I tried to transform the definition behind an X++ form from Dynamics 365 for Finance, at first without success:

    InventItemPurchSetup.CalendarDays = NoYes::No;
    InventItemPurchSetup.HighestQty = 0.00;
    InventItemPurchSetup.InventDimId = "AllBlank";
    InventItemPurchSetup.InventDimIdDefault = "AllBlank";
    InventItemPurchSetup.ItemId = "1000";
    InventItemPurchSetup.LeadTime = 0;
    InventItemPurchSetup.LowestQty = 0.00;
    InventItemPurchSetup.MandatoryInventLocation = NoYes::No;
    InventItemPurchSetup.MandatoryInventSite = NoYes::No;
    InventItemPurchSetup.MultipleQty = 0.00;
    InventItemPurchSetup.Override = NoYes::No;
    InventItemPurchSetup.StandardQty = 0.00;
    InventItemPurchSetup.Stopped = NoYes::No;
    InventItemPurchSetup.Sequence = 0;
    InventItemPurchSetup.OverrideDefaultStorageDimensions = NoYes::No;
    InventItemPurchSetup.SysRowVersionNumber = 0;
    InventItemPurchSetup.insert();

Multiple attempts were performed without any progress, though after a break I was able to apply the following transformations:

Prompts:

Replace the word "InventItemPurchSetup." with "IPS." and remove the text after the equality sign.
Add a comma in front of each line and remove the semicolon at the end.

With a few manual changes the code becomes:

SELECT IPS.CalendarDays 
, IPS.HighestQty 
, IPS.InventDimId 
, IPS.InventDimIdDefault 
, IPS.ItemId 
, IPS.LeadTime 
, IPS.LowestQty 
, IPS.MandatoryInventLocation 
, IPS.MandatoryInventSite 
, IPS.MultipleQty 
, IPS.Override 
, IPS.StandardQty 
, IPS.Stopped 
, IPS.Sequence 
, IPS.OverrideDefaultStorageDimensions 
, IPS.SysRowVersionNumber 
FROM dbo.InventItemPurchSetup IPS

The prompts reflect what I was doing manually in Excel via a replace, respectively via the "Text to columns" functionality.
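The same replace-and-reshape steps can also be scripted; below is a minimal Python sketch (the input lines and the `IPS` alias follow the example above; the script itself is my own illustration, not something generated by Copilot):

```python
# Sketch: turn X++ field assignments into a SQL SELECT column list,
# mirroring the replace and "Text to columns" steps described above.
lines = """\
InventItemPurchSetup.CalendarDays = NoYes::No;
InventItemPurchSetup.HighestQty = 0.00;
InventItemPurchSetup.ItemId = "1000";
InventItemPurchSetup.insert();""".splitlines()

columns = []
for line in lines:
    line = line.strip()
    if "=" not in line:  # skip method calls such as insert()
        continue
    field = line.split("=", 1)[0].strip()  # drop everything after "="
    columns.append(field.replace("InventItemPurchSetup.", "IPS."))

query = "SELECT " + "\n, ".join(columns) + "\nFROM dbo.InventItemPurchSetup IPS"
print(query)
```

Running it produces the same comma-first column list as the prompt-based transformation.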
 
There's probably the opportunity to test various further ideas. Frankly, the transformation from upper case to camel case could have helped me a lot in the past!

Disclaimer:
The queries were not tested after the changes were performed (because of unavailability of the systems).

Happy coding!

31 January 2025

🌌🏭KQL Reloaded: First Steps (Part II: Working with Dates)

Knowing how to work with dates and manipulate them accordingly is probably one of the important pieces of knowledge in any query language. Fortunately, KQL has many date-related functions that can help developers in the process and the available documentation is detailed enough to support users in learning their use. The current post focuses on the main uses of dates. 

Please note that the scripts consider only the main time units. Please check the documentation of each function for the other time units.

Create dates via make_datetime, now and ago functions:

// creating dates from parts
print datetime1 = make_datetime(2025,1,1)
, datetime2 = make_datetime(2025,1,1,0,0,0)
, datetime3 = make_datetime(2025,1,1,1,2,3)

// creating dates from values
print date1 = datetime(2024-02-01)
, datetime(2024-02-01 00:00:00)
, datetime(2024-02-01 23:59:59)
, datetime(2024-02-01 23:59:59.123)

// now vs todatetime (return the same value)
print datetime1 = now()
, datetime2 = todatetime(now())
, datetime3 = ago(0h)

Extract information from dates via datetime_part, substring and individual functions: getyear, week_of_year, monthofyear, dayofyear, hourofday, dayofmonth, dayofweek, endofyear, endofweek, endofmonth, endofday, startofyear, startofweek, startofmonth, startofday:

// date parts from date
print now = now()
, year = datetime_part('year', now())
, month = datetime_part('month', now())
, day = datetime_part('day', now())
, hour = datetime_part('hour', now())
, minute = datetime_part('minute', now())
, second = datetime_part('second', now())

// date parts from string 
let t = datetime("2024-12-31 10:35:59");
print year = substring(t, 0, 4) 
, month = substring(t, 5, 2) 
, day = substring(t, 8, 2) 
, hour = substring(t, 11, 2) 
, minute = substring(t, 14, 2) 
, second = substring(t, 17, 2) 
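As a cross-check, the same fixed offsets can be applied with plain string slicing; a minimal Python sketch, assuming the "yyyy-MM-dd HH:mm:ss" layout used above:

```python
# The fixed offsets in the substring calls map to positions in the
# "yyyy-MM-dd HH:mm:ss" layout: year at 0-3, month at 5-6, and so on.
t = "2024-12-31 10:35:59"
year, month, day = t[0:4], t[5:7], t[8:10]
hour, minute, second = t[11:13], t[14:16], t[17:19]
print(year, month, day, hour, minute, second)  # 2024 12 31 10 35 59
```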

// date parts via functions
print year = getyear(now())
, week = week_of_year(now())//ISO 8601 compliant
, month = monthofyear(now())
, day = dayofyear(now())
, hour = hourofday(now())

// day functions
print day_of_year = dayofyear(now())
, day_of_month = dayofmonth(now())
, day_of_week = dayofweek(now())

// end of time dates
print year = endofyear(now())
, week = endofweek(now())
, month = endofmonth(now())
, day = endofday(now())

// start of time dates
print year = startofyear(now())
, week = startofweek(now())
, month = startofmonth(now())
, day = startofday(now())

//time units
print hours1 = time(1hour)
, minutes1 = time(1min)
, seconds1 = time(1second)
, hours2 = totimespan("0.01:00:00")
, minutes2 = totimespan("0.00:01:00")
, seconds2 = totimespan("0.00:00:01")

Working with dates via datetime_add and datetime_diff functions:

// adding time units
print year = datetime_add('year',1,now())
, month = datetime_add('month',1,now())
, day = datetime_add('day',1,now())
, hour = datetime_add('hour',1,now())
, minutes = datetime_add('minute',1,now())
, seconds =  datetime_add('second',1,now())

// date differences
print years = datetime_diff('year', now(), ago(1h))
, months = datetime_diff('month', now(), ago(1h))
, days = datetime_diff('day', now(), ago(1h))
, hours = datetime_diff('hour', now(), ago(1h))
, minutes = datetime_diff('minute', now(), ago(1h))
, seconds = datetime_diff('second', now(), ago(1h))
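One subtlety worth noting: per the documentation, datetime_diff counts the calendar boundaries crossed between the two dates rather than full elapsed periods, so two datetimes one hour apart can differ by a "year". A minimal Python sketch of the year-level case (my own helper, not a KQL API):

```python
from datetime import datetime

def boundary_diff_years(d1: datetime, d2: datetime) -> int:
    # Count year boundaries crossed between d2 and d1, mirroring how
    # datetime_diff('year', d1, d2) behaves in KQL (boundaries, not full years).
    return d1.year - d2.year

# One hour apart, yet across the New Year boundary:
print(boundary_diff_years(datetime(2017, 1, 1, 0, 30),
                          datetime(2016, 12, 31, 23, 30)))  # 1
```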

Working with time zones via datetime_local_to_utc and datetime_utc_to_local functions:

// convert local time to UTC across time zones
print ParisTime = datetime_local_to_utc(now(),'Europe/Paris')
, LondonTime = datetime_local_to_utc(now(),'Europe/London')
, BudapestTime = datetime_local_to_utc(now(),'Europe/Budapest')
, AthensTime = datetime_local_to_utc(now(),'Europe/Athens')

// convert UTC to local time across time zones
print ParisTime = datetime_utc_to_local(now(),'Europe/Paris')
, LondonTime = datetime_utc_to_local(ago(1h),'Europe/London')
, BudapestTime = datetime_utc_to_local(ago(-1h),'Europe/Budapest')
, AthensTime = datetime_utc_to_local(ago(-2h),'Europe/Athens')

Applying different formatting with tostring, format_timespan and format_datetime functions:

// date and time to string
print date1 = tostring(datetime(2024-02-01 00:00:00))
, time1 = tostring(totimespan("0.01:02:03"))
, hours1 = tostring(time(1hour))

// formatting timespans
let t = time("25.10:35:59.123456");
print date1 = format_timespan(t, 'dd.hh:mm:ss:FF')
, date2 = format_timespan(t, 'ddd.h:mm:ss [ffff]')

// formatting dates
let t = datetime("2024-12-31 10:35:59");
print date1 = format_datetime(t,'yyyy-MM-dd [HH:mm:ss]')
, date2 = format_datetime(t,'yy-MM-dd HH:mm:ss')
, date3 = format_datetime(t,'yy-MM-dd [HH:mm:ss tt]')

Considering the above functions, one has a good basis for working with dates. 

Happy coding!


29 January 2025

🌌🏭KQL Reloaded: First Steps (Part I: Simple Queries)

If one has followed the Microsoft training, webcasts and other resources, it becomes clear that the Kusto Query Language (KQL) can't be ignored, as it's useful in various scenarios that deal with large volumes of data. 

Even if KQL was created to query big tables, at least for the first steps it's recommended to start with a small data model, and when the basics were understood one can move to bigger databases. Moreover, it's easier to validate the logic when using small datasets.

Probably, the most important aspect before starting is that KQL is case-sensitive and this applies to everything – table and column names, operators, functions, etc. The below code can be tried in the online explorer (see [2], [3]), which makes available several databases for playing around. The users might need to register beforehand.

The following queries are based on the ContosoSales database (available via the above-mentioned link). First, here are some simple projections. Each query is preceded by a short description commented via "//", and each query must be run individually. 

// selecting all records
Customers 

// selecting the first 10 records
Customers 
| take 10

// multiple filters
Customers 
| where CityName == 'Berkeley'
| where Occupation != 'Professional'
| take 10

// multiple filters on the same column
Customers 
| where Occupation == 'Professional' or Occupation == 'Clerical'
| take 10

// multiple filters on the same column
Customers 
| where Occupation in ('Professional','Clerical')
| take 10

// multiple filters on the same column
Customers 
| where not(Occupation in ('Professional','Clerical'))
| take 10

//subset of columns
Customers
| take 5
| project ContinentName, CityName, FirstName, LastName, Gender

Here are some examples for the selection of unique values, the equivalent of SELECT DISTINCT from SQL:

//distinct values used
Customers
| distinct  Occupation

//distinct values used sorted ascendingly 
Customers
| distinct  Occupation
| sort by Occupation asc

//combinations of values used
Customers
| distinct  Occupation, Education
| sort by Occupation asc, Education asc

When aggregated data is needed, one needs to resort to grouping values, the equivalent of GROUP BY:

// record count
Customers 
| count

// record count for constraint
Customers 
| where CityName == 'Berkeley'
| count

// record count for constraint: returns 0 records (KQL is case sensitive)
Customers 
| where CityName == 'BERKELEY'
| count

// number of records by occupation 
Customers
| summarize occupations_count = count() by Occupation

// number of records by occupation with bar chart visual
Customers
| summarize occupations_count = count() by Occupation
| render barchart

The last query renders the data directly as a bar chart, which is a cool feature, especially when one needs to understand the distribution of values. Executing the query without the last line returns the underlying dataset.

Azure Data Explorer - Chart example

Here are some first impressions:
1) The language is relatively simple, though the transition from SQL to KQL requires time, even if the thinking process of writing the code is not that far away. For those with no SQL knowledge, the transition might be a bit more challenging, though practice makes perfect!
2) One can try to run the code line by line to understand the steps used.
3) There are also some online code converters from SQL to KQL (see for example msticpy).
4) The intellisense capabilities facilitate the overall experience. 
5) Unfortunately, there seems to be no code formatter to HTML for KQL, so one needs to compromise in one way or another.
6) For a comparison between SQL and KQL see [4].

Happy coding!


References:
[1] Microsoft Learn (2024) Kusto Query Language learning resources [link]
[2] TestingSpot Blog (2021) Learning Kusto Query Language - A tool for performance test engineers [link]
[3] Azure Data Explorer (2025) Create a free cluster to explore your data [link]
[4] Microsoft Learn (2024) SQL to Kusto Query Language cheat sheet [link]

Resources:
[R1] GitHub repository (2022) Sentinel Queries [link]
[R2] GitHub repository (2022) Kusto Query Language Grammar [link]
[R3] GitHub repository (2024) The Definitive Guide to KQL source code [link]
[R4] GitHub repository (2022) Learning Kijo [link]
[R5] Kusto gamification [link]
[R6] Clickhouse KQL [link]
[R7] KQL Cafe user group [link]

19 April 2024

⚡️Power BI: Preparatory Steps for Creating a Power BI Report

When creating a Power BI report, consider the following steps when starting the actual work. The first five steps can be saved to a "template" that can be reused as a starting point for each report.

Step 0: Check Power BI Desktop's version

Check whether you have the latest version, otherwise you can download it from the Microsoft website.
Given that most of the documentation, books and other resources are in English, it might be a good idea to install the English version.

Step 1: Enable the recommended options

File >> Options and settings >> Options >> Global >> Data Load:
.>> Time intelligence >> Auto date/time for new files >> (uncheck)
.>> Regional settings >> Application language >> set to English (United States)
.>> Regional settings >> Model language >> set to English (United States)

Depending on the case, you can also consider the following options (e.g. when the relationships are more complex than the feature can handle):
File >> Options and settings >> Options >> Current >> Data load:
.>> Relationship >> Import relationships from data sources on first load >> (uncheck)
.>> Relationship >> Autodetect new relationships after data is loaded >> (uncheck)

Step 2: Enable the options needed by the report

For example, you can enable visual calculations:
File >> Options and settings >> Options >> Preview features >> Visual calculations >> (check)

Comment:
Given that not all preview features are stable enough, instead of activating several features at once, it might be a good idea to activate them individually and test first whether they work as expected. 

Step 3: Add a table for managing the measures

Add a new table (e.g. "Results" with one column "dummy"):

Results = ROW("dummy", "OK")

Add a dummy measure that could be deleted later when there's at least one other measure:
Test = ""

Hide the "dummy" column and with this the table is moved to the top. The measures can be further organized within folders for easier maintenance. 

Step 4: Add the Calendar if time analysis is needed

Add a new table (e.g. "Calendar" with a "Date" column):

Calendar = Calendar(Date(Year(Today()-3*365),1,1),Date(Year(Today()+1*365),12,31))

Add the columns:

Year = Year('Calendar'[Date])
YearQuarter = 'Calendar'[Year] & "-Q" & 'Calendar'[Quarter]
Quarter = Quarter('Calendar'[Date])
QuarterName = "Q" & Quarter('Calendar'[Date])
Month = Month('Calendar'[Date])
MonthName = FORMAT('Calendar'[Date], "mmm")

Even if errors appear (as the columns aren't listed in the order of their dependencies), create all the columns first. Format the Date column in a standard format (e.g. dd-mmm-yy), including Date/Time columns for which the Time part is not needed.

To get the values in the visual sorted by the MonthName:
Table view >> (select MonthName) >> Column tools >> Sort by column >> (select Month)

To get the values in the visual sorted by the QuarterName:
Table view >> (select QuarterName) >> Column tools >> Sort by column >> (select Quarter)
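The reason for the "Sort by column" steps can be illustrated outside Power BI as well; a minimal Python sketch (hypothetical data, not tied to the report):

```python
# Month names sorted as plain text come out alphabetically, not chronologically;
# this is exactly what the "Sort by column" setting works around in Power BI.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

alphabetical = sorted(months)                      # plain text sort
chronological = sorted(months, key=months.index)   # sort by month number

print(alphabetical[:3])   # ['Apr', 'Aug', 'Dec']
print(chronological[:3])  # ['Jan', 'Feb', 'Mar']
```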

With these changes, the values in the filters will appear in their natural order.

Step 5: Add the corporate/personal theme

Consider using a corporate/personal theme at this stage. Without this the volume of work that needs to be done later can increase considerably. 

There are also theme generators, e.g. powerbitips.com, a tool that simplifies the process of creating complex theme files. The tool is free; however, saving theme files requires a subscription.

Set canvas settings (e.g. 1080 x 1920 pixels).

Step 6: Get the data

Consider the appropriate connectors for getting the data into the report. 

Step 7: Set/Validate the relationships

Check whether the relationships set by default between the tables are correct, and adjust the relationships accordingly.

Step 8: Optimize the data model

Look for ways to optimize the data model.

Step 9: Apply the formatting

Format numeric values to represent their precision accordingly.
Format the dates in a standard format (e.g. "dd-mmm-yy") including for Date/Time for which the Time is not needed.

The formatting needs to be considered for the fields, measures and metrics added later as well. 

Step 10: Define the filters

Identify the filters that will most likely be used across pages and use the Sync slicers feature to synchronize the filters between pages, when appropriate:
View >> Sync slicers >> (select Page name) >> (check Sync) >> (check Visible)

Step 11: Add the visuals

At least for the report's validation, consider using a visual that holds the detail data represented in the other visuals on the page. Besides allowing users to validate the report, it also provides transparency, which facilitates the report's adoption. 


About Me

Koeln, NRW, Germany
IT Professional with more than 24 years experience in IT in the area of full life-cycle of Web/Desktop/Database Applications Development, Software Engineering, Consultancy, Data Management, Data Quality, Data Migrations, Reporting, ERP implementations & support, Team/Project/IT Management, etc.