In 2009, I started eAlchemy. It’s hard to believe that Minecraft and Bitcoin launched that same year — and the iPad (2010) had yet to be introduced. Our mission, to create tools that transform data into insight and help businesses make smarter decisions, wasn’t the simplest thing to explain.
In fact, I first launched eAlchemy three years earlier, in 2006. Most companies at that time were focused, somewhat primitively, on reporting and report automation. The idea of data-driven decision making? It was still a nascent concept. Our industry’s most notable buzzword, “big data,” didn’t really begin trending until several years after 2005, when O’Reilly Media’s Roger Mougalas was credited with coining the term as it applies to the modern world of mass Internet adoption. (If you missed it, Tim O’Reilly’s seminal 2005 post on “Web 2.0” is a must-read, particularly the section about data.)
Web searches for the term “big data” on Google didn’t start to accelerate until after 2010. SOURCE: Google Trends.
When I recommitted to eAlchemy in 2009, the early ideas and technology around data-driven decision making had matured. Businesses were clearly more focused on data, and it was much easier to sell our services. Thus began a sustained period of growth for eAlchemy.
In the past decade, many things have changed in the world of data and analytics — yet some things haven’t shifted as much as I would have expected. People now generally understand the idea of data-driven decision making. However, for some, it’s still hard to grasp what exactly we do here. I often point people to the “work” page on our website to illustrate the custom tools we build, and how they improve business processes and deliver the insights decision makers need today.
As I look out at my company’s next 10 years, I believe our industry is on the verge of Data 2.0 — where our ability to harvest the massive amount of available data for decision making will exponentially increase. And with that, our level of sophistication in how we apply that data to everyday decision making will grow dramatically.
But indulge me for the sake of this post. I want to look back and share 10 truths I’ve learned in our 10 years of building business intelligence tools.
1. The amount of data has dramatically increased (duh)
No one will be surprised to learn that the amount of data we generate is increasing. But just how much? In 2017, Domo estimated that every day, we generate 2.5 quintillion bytes of data — and that 90% of the world’s data had been generated in the previous two years. Inevitably, the data created by our digital activity continues to increase along with the growth of connected IoT devices. Big data is bigger than ever, but…
2. We’re not doing as much with that data as you might think
The vast majority of data never gets used. A study by IDC in 2012 revealed that only 3% of data generated was tagged and only 0.5% was ever analyzed. Certainly, those numbers have increased with the improvement of technology in the past few years. We’ve seen first-hand that many companies are engaged in discussions about the best way to harvest the value in their data, and some are investing in building tools to do so. But many of those projects ultimately get stifled. Why? Because…
3. Most IT departments are chartered with managing costs, not business intelligence
Many businesses still view information technology (IT) departments as cost centers rather than strategic teams. That means, at the core, these groups are measured by their ability to manage expenses. Projects that involve basic needs get approved. But projects that don’t involve basic, urgent needs — such as harvesting data and boosting business intelligence — may not get greenlighted. In particular, we’ve seen many massive ERP projects get approved because executives need to centralize and plan basic business operations. But the specific tools a business analyst might need to conduct more thoughtful analysis, make more strategic decisions, and make day-to-day decisions? Those projects often don’t make the cut — even if there’s the potential of a massive return on investment (ROI).
The good news for data-driven pros? IT’s role in data-related projects is starting to shift. In fact, among our clients, we’re more frequently working directly with the line of business, and not just IT…
4. Lines of business are increasing investment in technology and hiring technology experts
In the past 10 years, one of the most obvious changes we’ve seen is that the business groups chartered with growth are hiring their own technology and system specialists to drive tech projects — rather than relying upon internal IT organizations. It makes sense: traditional IT has often been a blocker to investment in technology that’s more forward-looking or experimental. That’s because these departments are typically measured on their ability to roll out projects and close help desk tickets.
If a company sees IT as a cost center with the purpose of “keeping the lights on,” then executives should empower lines of business to invest in their own tools where they see growth potential. That includes the creation of data tools that improve business intelligence and decision making. And having someone sitting inside a line of business, truly understanding its specific problems and opportunities, increases the chances of success for a technology investment.
IDC predicted that line-of-business spending on technology will outpace IT spending this year. In particular, we’re seeing this more and more in the retail supply chain, where groups organized by product line are hiring business system specialists who report to VPs in those groups. IT is still typically involved; often these business system specialists have a dotted line to IT. Unfortunately, in some cases there’s no line at all — nor any collaboration between IT and the line of business…
5. Shadow IT projects are still very much a thing
As one technologist proclaimed in 2005, “IT is from Mars, Business users are from Venus.” Sadly, we’ve seen that’s still true in many places. We’ve seen that the relationship between IT and the line of business can be so fractured that technology is implemented without IT even being made aware of it.
These projects, sometimes referred to as “shadow IT,” can become problematic for a couple of reasons. One, data governance across an organization becomes more crucial as a company’s proprietary data grows — and these skunkworks projects can lead to data discrepancies. Two, many times a business user may invest in a tool that someone in another group could benefit from. Without a centralized resource, such as IT, being involved, there are missed opportunities to roll out that tool more broadly and increase the ROI of a project.
Ideally, as more line of business groups hire systems/tech people, they’ll also be chartered with understanding how the technology they’re investing in might plug into the larger organization. Because shadow IT isn’t good for anyone. And, we can hope that IT organizations can become an accelerant rather than a drag on the velocity of these efforts.
6. There’s too much emphasis on cost — and not enough on ROI
There are two sides to every technology business case: the cost of building something vs. the return on investment of that project. As consultants, we rely on our clients’ internal stakeholders to build that business case, showcase a clear ROI, and ultimately get budget approval for our projects. Of course, we’re biased: We think companies should aggressively invest in data projects with a clear ROI.
But we still see projects get blocked because fixed calendar-year budgets put in place months earlier simply can’t be adjusted to account for up-front project costs — regardless of how much anticipated return there may be on that investment. The companies that are winning are moving away from fixed budgets for tech investment. Instead, management allocates funds based on current need and opportunity — not on historical spending or budgeting. They prioritize long-term ROI over short-term expense.
Another byproduct of fixed-budget environments: We see stakeholders try to hack solutions together on the cheap…
7. Low-cost solutions often end up being expensive
As I wrote in a post about how to choose software development partners, “Everything has a price — but in the world of software development, if you go cheap at the start, you’re more than likely to pay for it later.”
We still see many companies decide to invest in low-cost — and often offshore — development resources. In some cases, we’ve benefited from these poorly implemented solutions because we get a phone call for help. And we fix them. But for the companies involved, these become very expensive projects. Not only does much of the work need to be redone; the project schedule is extended, and the opportunity cost of having a functional tool earlier is lost.
8. Machine learning is ready for investment
As I wrote in a recent post about our investment in machine learning, I’m convinced that “machine learning (ML) is going to change how businesses make decisions, at almost every level. And this is especially true where I’ve spent most of my career — at the intersection of data analytics and supply chain management (SCM).”
What’s interesting about machine learning is its potential to be a great equalizer for medium-sized companies and smaller lines of business. A well-tuned and trained machine learning algorithm can instantly analyze massive amounts of data — something that might take human teams days or weeks to process. ML has the potential to boost business intelligence for medium-sized companies in a way that was only possible for resource-rich enterprises in the past.
And I also believe that, as we enter Data 2.0, machine learning is ready for investment — especially in supply chain. Like I said, “We’ve arrived at a tipping point moment: The cost and effort of creating a machine learning SCM solution are far exceeded by the benefit of implementing it.”
9. Data can really (be a time) suck
I’m proud of how the tools we build unlock data and transform that data into insight. But if you had to use one word that defines what we do? Automation. In this era of big data, business analysts are spending an increasing amount of time simply searching for the specific data they need. They unnecessarily burn hours digging it out of complex ERPs, spreadsheets, databases, and emails. And they tediously build the same reports week after week after week. There’s a better way.
Our custom applications — applying technology, data science, and user interface design — automate the reporting process. These automated tools unlock data, refine it, and then present it in its most useful form for the decision-makers who need it. And tools like these save teams hours of time reporting so they can focus on doing what they were hired to do — make smarter decisions to move the business forward.
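To make the idea concrete, here’s a deliberately minimal Python sketch (not our actual product code — the data and function names are hypothetical) of this kind of reporting automation: take a raw export an analyst would otherwise wrangle by hand, refine it, and present a summary.

```python
import csv
import io
from collections import defaultdict

# Hypothetical weekly sales export -- stands in for the data an analyst
# would otherwise dig out of an ERP, spreadsheet, or email by hand.
RAW_EXPORT = """region,product,units
West,Widget,120
West,Gadget,45
East,Widget,80
East,Gadget,60
"""

def build_report(raw_csv: str) -> dict:
    """Refine raw rows into per-region unit totals."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["region"]] += int(row["units"])
    return dict(totals)

def render(totals: dict) -> str:
    """Present the refined data in a form a decision-maker can scan."""
    return "\n".join(
        f"{region}: {units} units" for region, units in sorted(totals.items())
    )

if __name__ == "__main__":
    print(render(build_report(RAW_EXPORT)))
```

Trivial as it is, the shape is the point: once the extract–refine–present pipeline is code, the report that used to eat hours every week runs on a schedule instead.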
10. For most companies, there’s more value in little data than big data
There’s still a lot of data that isn’t being harvested today that could be (as noted above, in 2012, only 0.5% of all data was being analyzed). And I believe that for most companies today, there’s more value to be found in “not-so-big data” than there is in big data.
As I wrote last year (“Why not-so-big data may be your best data”), “Despite the buzz around big data, most companies will see higher returns more quickly using their not-so-big data. Data they can use to make better decisions right now.”
While big companies may be building big-data solutions, companies that have yet to reach a basic level of data maturity are better suited to solving problems with their not-so-big data. For example: tools that automate reports and save employees valuable time, or custom data views — built with visualization tools and algorithms — that give team members and executives real-time insights for making better decisions.
Big data is cool, of course, but building big data solutions for many of these daily operations problems would be impractical and expensive for most of my clients.