A dream where I can spin up a PowerApp (or Microsoft Form) in a few minutes, publish it out, and the SharePoint list will magically be created. A dream where others will then pick up the baton and perform the data optimizations and connectivity for an array of data sources while avoiding Microsoft CDS licensing requirements. A dream where I can deploy applications in minutes, and then over the following hours, days, and weeks the rest of the work to fully build out the data architecture for the application can happen w/o losing data or slowing down deployments.
I dream this dream almost every day.
A dream of a better, different world
So what if we could build out a PowerApp that takes inputs/data from an end-user in a matter of minutes, and we could quickly throw that data "somewhere" for storage? The funny thing is, nearly every single "low-code" or "no-code" style application starts from the opposite direction:
Let's start with your data!
They all seem to start this conversation thinking that the end-user, the process-builder, or the specifications generated by them know what in the world they're talking about. This is built on the assumption that people who know how to follow processes, know how to build processes.
This is likely an output of a general MBA perspective (I feel dirty) that assumes we've sat around and done a ton of work to verify what is actually important to the organization. However, in the world of no/low-code, the assumption is the opposite. We don't know what's truly important yet (at least not how this particular process/app aligns w/ organizational goals), and we don't have time to meet 100 times to show you Visio diagrams, Microsoft Project estimates, etc., only to deliver 1/10th of what you needed in 10x the timeline of no/low-code.
Certainly there are caveats w/ approaches like this, but it is based on a reality that is much closer to the world where we live:
We don't know what data you need to work with nor do we know what aspect of that data is important
Boy does Power BI actually understand this philosophy. Kind of funny that they seem to have forgotten that within Power Apps.
What I find myself looking for is a way to easily start a process simply from the GUI. Where a Business Analyst can quickly generate a UI in a tool w/ 80% of the functionality needed, that then dumps this data "somewhere" w/o having to get into all the intricacies of data storage, optimization, security, etc.
I want them to be able to build something quickly, get it to a proof-of-concept that is functional, then throw that over the wall to a Data Analyst who can do some validation on it and start deciding where/when/why we should store each item within a record and/or replicate/link it somewhere else. Where a Data Security Analyst can then analyze the various data elements to classify, encrypt, and secure them. Where an entire series of staff can work on the back-end to optimize the data and queries on the fly w/o impacting the application. Then eventually, delivering a recommended method of reading/writing to this new optimized and secured data store back for inclusion in the original application.
I want this dream to be reality.
Limitations in the o365 Power Platform
NOTE: I LOVE this entire platform. However, I also recognize when my interests might not align w/ the interests of others
Let's be honest, Microsoft isn't chasing this dream. In fact, our problems are secondary (and perhaps tertiary) to Microsoft's own (a common problem in cloud). They are trying to build out application models that serve a distinct purpose of building some kind of monster AI that no longer needs to be told we want to take a week off, but which knows when we need it, approves it, schedules it, and books our itinerary. All while trying to lower their costs/hour and increase revenue/hour, and to get that down to per minute if not per second.
The evidence that supports this is several-fold.
The CDS (now Dataverse) is a money grab
NOTE: See the bottom of this section on Project Oakdale as Microsoft might have recognized their error on pushing the CDS so very hard.
Microsoft REALLY wants you to use the CDS. And they REALLY want you to license every end-user inside your organization to start using it. It's a big SQL Server cluster that they're so excited to have purpose-built around Dynamics (an also-ran in the CRM game now turning into...an ERP?), along with a series of pre-built data structures. You can imagine their team having meeting after meeting to optimize the data stores and having repeated orgasms as they link the data tables together into beautifully optimized repositories that barely register a blip in their data centers when we query them.
And yet, I know that I can throw 1000x the GLOBAL computing power of 1970 at any problem I have today and never even connect my desktop computer to the network. So is that really MY PROBLEM you're trying to solve, Microsoft, by forcing me into pre-built data structures to save me some teensy bit of time in making them myself?
The decision to drop HTTPS web service calls from the base PowerApps licensing (even when talking to Azure services - somebody apparently listened to my bitching) shows this is 100% true. There simply is no other argument.
However, I will admit that I think Microsoft may have seen the light on this. They appear to be changing the licensing around this and perhaps...just perhaps...Project Oakdale will give us a reasonable relational database that will be very accessible (at least for what they're calling Teams Apps).
Power Automate (aka Flow) needs a complete UX rewrite...but
Power Automate / Flow is a dumpster-fire of trying to serve multiple audiences with disparate needs. It is a business-facing automation tool that requires a developer to understand it. It then limits the capabilities of a developer endlessly to make it easy for a business user who still will never understand it. The entire UX is a cluster of things that are pre-built to make it "easy" for you. But is this because the UI is terrible? Or is it terrible by design?
This terrible UX for developers and business-users alike seems to also heavily inspire people to use the canned integrations provided. Canned integrations that are optimized behind the scenes for delivery (to lower costs...for Microsoft).
Delivering a nicer UI would lead you to customize more and more, forcing Microsoft to eat the cost of the extra compute you consume (under the existing licenses you are already paying for). Like any organization doing cloud, they want you to pay for services you do not use, or to minimize the cost of the services you do use. Therefore, having you use canned, pre-optimized solutions is in their interest.
The addition of the RPA tools to Power Automate in 2020 shows this in spades. They did the minimum required to sell the product; now they want you to buy the DLC.
Power BI and PowerApps are the loss leaders
There are lots of things I might bitch about on these two platforms on occasion, but for the most part, I think they deliver exactly what the business needs. They are pulling organizations into the o365 ecosystem and companies are heading there willingly or not.
Power BI in fact has so many "features" that allow end-users to bypass corporate security and licensing in some very scary ways that it is clear they wanted this to be something that got "picked up" by a department somewhere, got used to deliver data to leadership, then could not be easily unwound from said leadership's vice-like grip, so IT has to accommodate it. The longer IT ignored the threat (because they're overworked anyway), the worse it got, until they had to concede and license out the rest of the system.
This model plays out time and again around the globe.
And I can't fault Microsoft for delivering a great product in some sneaky ways designed to enhance uptake. Large organizations are change averse and often use risk-aversion (real or imagined) as the primary excuse. While risks definitely exist and data policies and security must be in place, there truly are limits to how tightly any organization can secure its data.
Power BI delivers time and again, so the rewards just win out over the risks.
PowerApps isn't as sneaky about how it enters an organization, but once you've got the licenses in place, you'd be making a terrible decision not to invest in it. For any internal application UI, PowerApps should be your answer. Full stop. It delivers enough versatility, reliability, and security that there really is no excuse on the UX side of the platform.
The thing is, these two are done so very well, how in the world is everything else behind them such a cluster !$!!@?
I don't work there, so I can only guess (as I have above).
Changing the delivery model to embrace Low/No-code
I've pointed out perhaps why the delivery model currently around PowerApps (and Oracle APEX, and Appian, and Google AppSheet, and...) still appears to be tied to a belief that we have some data we want to start with. And while there are plenty of instances where this is the case, the data we previously captured was captured for reasons that might no longer be valid. Data that was also limited by the difficulty of capturing it.
If we can now capture so much more data from an end-user far more easily, shouldn't we be starting our conversation w/ what is the problem we are trying to solve and who are we solving it for?
Let's imagine the answer to that question is:
...whatever makes the end-user's life easier.
Because it is.
Every time.
NOTE: Feel free to go back and punch any/all of your instructors you had during your MBA classes in the throat. Do it once more for good measure.
So while we might start our projects w/ leadership/managers in mind, we ultimately need to arrive at the desktop of the end user. They know what leadership wants to know. They just are too busy to explain it to them. So let's solve for that and start w/ them as the primary point of delivery. If we make it easier for end-users to do their jobs, then we can capture a lot of data along the way and simplify any additional questions we might need to ask to satisfy the desires of leadership.
First Step: JSON to the rescue!
You're sitting there w/ the end-user and they've just finished complaining about the 16-step process for them to get X done that crosses an array of applications, departments, and processes. They're getting all whipped up into a frenzy about how difficult it is going to be to do this because they can barely explain it all to you.
Meanwhile, you calmly create the form that is the beginning of the solution to their problem.
So before they are finished describing the problems that are involved in how "...800 different cashiers and points of sale within our organization need to tell us how much cash they have in their drawers at the end of the day..." and all of the problems and issues they've had around that, you share this via Teams chat:
"I can share this w/ them right now if you'd like..." - You |
By the end of that meeting you deploy your application and send the data...to SHAREPOINT?!?!
Yes. The key is we just need to stash this data "somewhere" temporarily while the rest of the design process slowly moves forward.
Let's see if we can get this done in a way that makes sense.
If I take everything within a Canvas app form and wrap it up in a Collection before writing it out to my data source, then I'm only one command away from converting that whole bundle into a JSON data structure.
ClearCollect(dataToWrite, ...);
Set(jsonDataToWrite, JSON(dataToWrite));
The JSON() function in PowerApps does all of the conversion and escapes out any special characters. So once you've got that, you can write the whole bundle out to a single SharePoint field similar to:
// Bundle everything from the form controls into a single collection
ClearCollect(
    dataToWrite,
    {
        location: Dropdown1.Selected.Value,
        date: DatePicker1.SelectedDate,
        fives: TextInput1.Text,
        tens: TextInput1_1.Text,
        twenties: TextInput1_2.Text,
        fifties: TextInput1_3.Text,
        benjamins: TextInput1_4.Text
    }
);
// Serialize the whole collection into a single JSON string
Set(
    jsonDataToWrite,
    JSON(dataToWrite)
);
// Write one new SharePoint item: a friendly Title plus the raw JSON payload
ClearCollect(
    dataWritten,
    Patch(
        cashOnHandReports,
        Defaults(cashOnHandReports),
        {
            Title: Dropdown1.Selected.Value,
            rawJSON: jsonDataToWrite
        }
    )
)
Not enough benjamins, yo!
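For reference, what lands in that rawJSON column is just the serialized collection - something roughly like the following (the values are made up, and the exact property order and date format may vary; JSON() emits the collection as an array of records with dates in ISO 8601):

[{"benjamins":"1","date":"2020-10-31T00:00:00.000Z","fifties":"2","fives":"4","location":"Register 12","tens":"7","twenties":"12"}]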
A few caveats here:
- Don't use the Title field to store JSON (single line of text fields are limited to 255 characters)
- Make sure you handle the default logic of unique Title fields either in your app or by changing the settings on the SharePoint list
- Create a multi-line text field to store the JSON (it's limited to 63,999 characters - see the length-check sketch below)
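And if you're worried about bumping into that 63,999-character ceiling, a quick guard before the write keeps you honest. This is just a minimal sketch reusing the hypothetical control and list names from the example above:

If(
    Len(jsonDataToWrite) <= 63999,
    // Safe to write: stash the JSON bundle as a new list item
    Patch(
        cashOnHandReports,
        Defaults(cashOnHandReports),
        {
            Title: Dropdown1.Selected.Value,
            rawJSON: jsonDataToWrite
        }
    ),
    // Too big: warn the user instead of silently truncating the data
    Notify(
        "This submission is too large to store in one item - please split it up.",
        NotificationType.Error
    )
)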
So if you create a single SharePoint list similar to the above, you can then go in and create a copy of that same list for any project you start.
The fun thing about this is: you can do this for EVERY SINGLE APPLICATION. The extra fun thing about this is: you can create this within a default Team for new projects, and then when you create a new Team, copy from an existing one.
You'd never need to do anything differently from a starting-point perspective. Start here, then enhance as you go. Everything starts from a common data model for storage and a common process-model within Microsoft Teams.
Next Steps
So you've got the data from the end-user, now what?
Well, that's the key part here. The #1 issue solved by low/no-code applications is data collection at the edge, closely followed by accessing data at the edge. You've solved the first, but the second is kludgy until we optimize the data. You could do it w/ your prototype application (which will still work); it will just require some optimization over time.
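For what it's worth, the prototype can read the data back in an equally blunt way - filter the list and drop the raw JSON into a gallery. A minimal sketch, reusing the list and control names from above (the kludgy part is that you're eyeballing a JSON blob instead of nicely typed columns):

// Items property of a gallery showing this location's submissions
Filter(
    cashOnHandReports,
    Title = Dropdown1.Selected.Value
)

// Text property of a label inside that gallery
ThisItem.rawJSON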
Next up...creating your SharePoint list/columns/types automatically?!? Whhaaaaaat?! Stay tuned.
And I will use my most hated of the Power tools, Power Automate, to do it.