When most people think of Amazon, they think of being able to order just about anything from their phone and getting it delivered to their doorsteps in a matter of days (sometimes even in a matter of hours).
But if Amazon dominates e-commerce — and was named the world’s most valuable brand for three years running as a result — it’s because it doesn’t see itself as an online retailer.
First and foremost, Amazon is a data company.
Data fuels its mission ‘to be the Earth’s most customer-centric company.’
More to the point, being data-first has enabled Amazon to successfully expand beyond e-commerce, and build market-leading products like Alexa and the standard-setting Amazon Web Services.
In asset management, data is equally key. From portfolio selection to client advisory, decision-making, and regulatory reporting, many business-critical activities are grounded in data.
But where Amazon has taken a forward-thinking approach to managing its data, asset managers lag behind. And the outdated practices many firms still rely on make it harder for them to operate efficiently and serve their customers to the best of their abilities.
Given the current state of data management and the growing importance of data in the asset management industry, it’s high time firms took a page or two out of Amazon’s book.
So what can they learn from the way Amazon approaches data management?
Amazon’s data manifesto
Jeff Bezos set out Amazon’s approach to data in a 2002 memo. While the actual text isn’t publicly available, it’s widely paraphrased as follows:
- All teams will henceforth expose their data and functionality through service interfaces [You might know service interfaces as APIs, or application programming interfaces]
- Teams must communicate with each other through these interfaces
- There will be no other form of interprocess communication allowed: no direct linking, no direct reads of another team’s data store, no shared-memory model, no back-doors whatsoever. The only communication allowed is via service interface calls over the network
- It doesn’t matter what technology they use. HTTP, Corba, Pubsub, custom protocols — doesn’t matter
- All service interfaces, without exception, must be designed from the ground up to be externalizable. That is to say, the team must plan and design to be able to expose the interface to developers in the outside world. No exceptions
- Anyone who doesn’t do this will be fired.
- Thank you; have a nice day!
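To make the mandate concrete, here is a minimal sketch of what ‘exposing data through a service interface’ can look like in practice. It uses only Python’s standard library; the fund dataset, endpoint layout, and field names are illustrative assumptions, not Amazon’s actual design.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Illustrative in-memory data store owned by one team.
# Per the mandate, other teams never read this dict directly --
# they can only reach it through the service interface below.
FUNDS = {
    "FUND-001": {"name": "Global Equity Fund", "nav": 104.27},
    "FUND-002": {"name": "Sustainable Bond Fund", "nav": 99.81},
}

class FundDataHandler(BaseHTTPRequestHandler):
    """Serves the team's fund data as JSON over HTTP."""

    def do_GET(self):
        # A request to /FUND-001 returns that fund's record.
        fund = FUNDS.get(self.path.strip("/"))
        if fund is None:
            self.send_response(404)
            self.end_headers()
            return
        body = json.dumps(fund).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the example quiet

def make_server(port: int = 0) -> HTTPServer:
    """Create the server; port 0 lets the OS pick a free port."""
    return HTTPServer(("127.0.0.1", port), FundDataHandler)
```

The point of the sketch is the boundary, not the framework: because consumers only see the interface, the owning team can change how the data is stored without breaking anyone, and the same interface can later be exposed externally.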
Here are five lessons asset managers can take away from this manifesto and apply to their business.
Lesson 1: It’s easier to exchange data when the process has been standardised
Asset managers face data challenges for many complex, interconnected reasons. But, in our experience, they often boil down to a lack of access to an accurate, well-understood source of data and unclear ownership.
Who holds a specific data set? Where does data come from and who is it sent to? And who is responsible for verifying that data and ensuring it stays accurate and up to date?
Unclear sourcing and ill-defined responsibilities mean those in data governance or data quality roles are often lumbered with fixing issues in data they didn’t produce and whose source and lineage they don’t control. As a result, accountability is either spread so broadly it’s meaningless or, worse, absent altogether.
Amazon’s data manifesto overcomes this challenge by standardising data interactions. All data exchange must happen through APIs. And, crucially, all other forms of data exchange are prohibited.
This approach achieves two key goals.
Firstly, it eliminates complexity and creates a golden source of truth.
There’s no need to track down data and clean it up so it’s fit for purpose. The data is stored in an easily understood format and accessible via a well-defined, user-friendly API, which means teams across the organisation have one readily available, trusted source of data.
Secondly, it breaks down silos. Instead of having pockets of data spread around the business — this is all too often the case, especially in large asset management firms — teams can access what they need when they need it from one place, regardless of where it came from, who owns it, and where it’s stored.
Lesson 2: Contextualising your data expands its potential use cases
A single source of complete, accurate, easily accessible data is a crucial first step in the right direction. But, equally importantly, you need to put that data in context.
Amazon excels at this.
The vast amount of data the company collects about customers helps it build a 360-degree view of them as individuals. The personalised recommendations the data produces are so uncannily accurate they’re thought to account for as much as 35% of Amazon’s annual sales.
Amazon also uses data to identify sellers’ needs and create value-added business-to-business products that address them, such as their suite of brand protection services.
In asset management, putting data in context can also provide enormous value when it comes to building out new services and developing new ways to improve customer engagement.
As Accenture notes, creating a 360-degree view of investors boosts acquisition, retention, and cross-selling opportunities.
Similarly, as interest in sustainable investments booms, firms can gain a leg up on their competitors by presenting ESG data to retail investors in a way that empowers them to make sensible decisions.
Lesson 3: People and processes should come before tech
An all too common mistake in digital transformation projects is placing too much focus on technology.
This sounds counter-intuitive. But while technology is instrumental in enhancing your data management capabilities, it’s a means to an end, not the end in itself. The more important elements of the project are the goals you’ve set yourself and the people you’ve engaged to help you achieve them.
As Amazon’s data manifesto bluntly puts it, ‘[Technology] doesn’t matter…‘. It’s what you do with the technology that counts.
So where do you start when overhauling your data management capabilities?
Here are our recommendations:
Work with specialists, not generalists
While big firms with a broad range of capabilities can seem like a more attractive proposition, you may be better off working with a specialist.
Specialists bring deep understanding and know-how of a particular area, so they can home in on your core issue and build a tailor-made solution.
Sort out vendor onboarding as soon as possible
Onboarding procedures can result in significant delays, so it pays to get them out of the way quickly.
It’s worth walking the vendor through the requirements ahead of time. This will avoid surprises that could push back the project start date.
Start small
A new project is an opportunity to start with a clean slate. But as tempting as it is to implement radical changes, this isn’t advisable when you’re trying to put new data processes in place.
At Fundipedia, we typically recommend starting out with a tight scope and a short implementation timeline – not more than 16 weeks. This helps you build rapport with your technology partner and boost confidence by scoring some early wins.
If you’ve chosen the right platform, you can always re-think, consolidate, or tweak processes at a later stage, once you’ve implemented the initial project successfully.
Involve stakeholders early on
Do you need the input of other vendors or service providers, aside from your technology vendor? Involving them too late in the process could result in delays, especially if they rely on manual processes themselves.
It’s also worth getting end users’ input at an early stage.
The success or failure of any digital transformation project rests on whether your people will actually use the new technology. So it pays to get their feedback on whether the solution you’ve chosen meets their practical requirements.
Lesson 4: Inter-organisational cooperation is beneficial for everyone
Whether it’s fear of missing out or anxiety around losing some kind of first-mover advantage, asset managers aren’t generally keen on collaboration. So the Amazon data manifesto’s suggestion that ‘…service interfaces, without exception, must be designed from the ground up to be externalizable…‘ can seem like a big ask.
But the truth is that, far from putting firms in a position where they’re giving away some secret sauce, collaboration has several compelling benefits.
Collaboration increases customer trust and makes it easier to identify and exploit new market opportunities. It also has the potential to improve data quality across the board by standardising the rules for data integrity.
Most significantly, collaboration makes it easier to resolve common issues.
Asset management firms broadly have the same data challenges. Why spend precious time, money, and effort trying to reinvent the wheel when you can learn from and build on what others have done?
Lesson 5: Put strict data governance controls in place
If Amazon’s approach to data sets the standard asset managers should aspire to, it isn’t perfect. Case in point: in 2020, a Wall Street Journal investigation found that Amazon had used data about its sellers to develop competing private-label products.
Amazon strenuously denies these claims. But whether they’re well-founded or not, the allegations are a reminder that great power comes with great responsibility.
Investing in technology and data infrastructure is at the top of the list of asset managers’ priorities in 2022. As firms’ data capabilities and, in turn, their data literacy increases, it will become more and more important to have controls in place that prevent that data from being misused.
One of the key ways firms can put checks and balances in place is by automating data governance.
Data management platforms like Fundipedia enable you to control who can access what data, and create an audit trail that shows you how it’s used across the business. They also eliminate the need for data to be manually verified and reconciled, which solves the issue of lack of adequate data oversight resources.
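The two controls described above, entitlement checks and an audit trail, can be sketched in a few lines. The roles, dataset names, and log format below are illustrative assumptions for the sketch, not Fundipedia’s actual implementation.

```python
from datetime import datetime, timezone

# Illustrative role-to-dataset entitlements (an assumption for this sketch).
PERMISSIONS = {
    "portfolio_manager": {"holdings", "nav"},
    "client_advisor": {"nav"},
}

AUDIT_LOG = []  # every access attempt is recorded, allowed or not

def read_dataset(user: str, role: str, dataset: str) -> str:
    """Check the role's entitlement, record the attempt, then grant or refuse."""
    allowed = dataset in PERMISSIONS.get(role, set())
    AUDIT_LOG.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "dataset": dataset,
        "allowed": allowed,
    })
    if not allowed:
        raise PermissionError(f"{user} ({role}) may not read {dataset}")
    return f"contents of {dataset}"  # stand-in for the real data
```

In a real platform these checks sit at the API layer and the log is persisted immutably; the key design point is that the audit entry is written before the decision is enforced, so refused attempts are visible too.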
Beyond that, you should ensure your technology partners have robust security in place.
Fundipedia, for instance, is ISO 27001:2013-certified and uses full-disk 256-bit encryption, the same strength of encryption governments rely on to protect top-secret documents. We also conduct regular, independent penetration testing so we can identify and fix any vulnerabilities before anyone can exploit them.
Asset managers should ‘be more Amazon’
Asset managers are increasingly reliant on data. But because many firms use outdated data management processes, they’re not even close to unlocking its full potential.
It’s high time the industry as a whole rethought its approach.
Standardisation, collaboration, and partnering with the right technology vendor will help firms improve data accuracy and lower costs while using resources more efficiently.
But, most important of all, better data management will enable firms to make the most of new opportunities and deliver the level of service investors have come to expect in 2022.
Want to learn more about how Fundipedia can help you adopt a 21st-century approach to data management?