Channel: iTWire - Entertainment

How to build and future-proof data management platforms


Interview with Kshira Saagar, Head of Analytics & Data Science, The Iconic

“From a data and analytics point of view, the challenges facing all the sectors are all about the same – they only differ in their lead time to actualisation.”  Kshira Saagar

By kyeling | February 18, 2018

Ahead of the upcoming Chief Data & Analytics Officer Sydney conference, we caught up with Kshira Saagar, Head of Analytics & Data Science at The Iconic, one of Australia’s leading online retailers. He talked us through his top tips on how to build and future-proof data management platforms, as well as key factors in building a data-driven culture.
 
Corinium: You’ve worked in the data science field across industries on both the vendor and client side and you’re currently with The Iconic. Are there any unique challenges you see facing the retail sector?

KS: From a data and analytics point of view, the challenges facing all the sectors are all about the same – they only differ in their lead time to actualisation. Problems of scale, data credibility and usefulness of the insights are the same across various industries. The retail and technological sectors have always been among the earliest to adopt innovation and new-age solutions to reduce this lead time. This becomes all the more imperative for us as we seek to better serve customers who constantly expect enhanced shopping experiences, and to make our colleagues’ lives better by supercharging them with the right answers and tools to enable those experiences.
 
Corinium: When thinking about future proofing your data platforms, what do you consider the main factors to take into account?


KS: Any future-looking data initiative must be able to answer for and accommodate three major components – scale, veracity and access.

Scale – the one thing we can predict confidently about the future is that the size of the data, and the number of input sources for data collection, will increase exponentially. This means many current approaches and technologies to data warehousing from 15-20 years ago will no longer serve the data needs of the future – leading us to rethink our data architecture around new-age data solutions for the next 5 years, if not more.

Veracity – with such volume and velocity of data comes the problem of veracity – i.e. data credibility. Collecting and processing millions and millions of rows/tuples/columns of data a day is all fine, but it will mean nothing if, at the end of it all, the data is deemed unreliable and incorrect. Many older data solutions encourage data collection over data verification. Performing real-time, event-driven algorithmic data integrity checks sits outside these solutions or needs to be over-engineered, which highlights the need for a new architecture that can not only collect, process and clean data but also check its integrity and provide more confidence in the data collected.

Access – the biggest roadblock to a future-looking solution is that, given the lack of maturity of the solutions in the market, making the data collected and processed on these platforms accessible to a wider section of the company becomes a challenge – not only for the wider business folk, but also for the everyday analyst, who is not too happy about having to write complicated Scala code just to start interacting with the data. This makes it absolutely imperative to build an accessible-for-all last mile that integrates well with the future data platform.
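The event-driven integrity checks mentioned under veracity can be sketched in miniature. The sketch below is purely illustrative – the field names, rules and valid-value sets are hypothetical, not The Iconic's actual pipeline – but it shows the idea of validating each incoming record and quarantining failures rather than silently loading them.

```python
# Minimal sketch of an event-driven data integrity check (illustrative only;
# field names and rules are assumed, not taken from any real platform).
from dataclasses import dataclass, field

@dataclass
class IntegrityChecker:
    accepted: list = field(default_factory=list)
    quarantined: list = field(default_factory=list)

    def check(self, record: dict) -> bool:
        """Apply simple veracity rules; accept or quarantine the record."""
        rules = [
            record.get("order_id") is not None,               # key must exist
            isinstance(record.get("amount"), (int, float))
                and record["amount"] >= 0,                    # no negative sales
            record.get("currency") in {"AUD", "USD"},         # assumed valid set
        ]
        if all(rules):
            self.accepted.append(record)
            return True
        self.quarantined.append(record)
        return False

checker = IntegrityChecker()
checker.check({"order_id": 1, "amount": 59.95, "currency": "AUD"})  # accepted
checker.check({"order_id": None, "amount": -5, "currency": "AUD"})  # quarantined
```

A real implementation would run checks like these inside the streaming layer itself; the point of the argument above is that the architecture should make this a first-class feature rather than an afterthought.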
 
Corinium: And what do you think the most common pitfalls to avoid are?

KS: The biggest pitfall for any data platform, be it present or future, is separating the custodians of the data platform from the business and making technical superiority of the platform their mission. When that happens, it becomes more of a technical exercise in efficiency, optimisation and technical wizardry. This automatically nullifies two key components of a successful data platform mentioned above, resulting in a lack of data veracity and limited access at the right levels for the right people.

Validating, accrediting and approving that the data being collected is right needs a lot of business collaboration and input. For example, a very big data anomaly can often be explained away by a business user with the right knowledge of what caused it, using a very simple adjustment or rule. Moreover, when the teams operate out of sync, getting access at the right levels to the right things becomes an issue – thanks in part to the nature of the setup, but mostly to increasingly strict data security requirements, which are amplified when the teams are separate.
 
Corinium: One of your core responsibilities at The Iconic is enabling data-driven decision making. We often hear that analysing the data and uncovering the insight is the easy part, it’s getting the business to engage with and act on the insights that is the real challenge. What has your experience with this been?

KS: I tend to disagree with this assessment of how data-driven decision making happens. Any business user is more than happy to make use of these insights immediately if they are convinced of the effectiveness of the analysis and it can answer all their questions. As long as the analysis/insight/tool is built to be holistic in its thought process and can answer the ‘what will happen if…’ questions through irrefutable simulations, I don’t see any reason why it would be a challenge in the first place.

Example – there’s a piece of analysis that answers why sales were higher last month compared to previous months. It is very well detailed and has 50 graphs of all possible internal variables imaginable. The business asks a simple question: “What exactly should we do next month to replicate this success, i.e. what are the top 5 actions from this analysis?” If the analysis can’t answer that question in tactical terms, or if it had not considered the 3 other external factors, no one is going to use it – and that’s not the business’ fault.
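The ‘what will happen if…’ point can be made concrete with a toy simulation. The sketch below is a minimal Monte Carlo example with entirely hypothetical numbers (base sales, uplift figures); it illustrates turning a backward-looking finding ("the promotion lifted sales") into a forward-looking range the business can act on.

```python
# Toy "what will happen if…" simulation (all figures are hypothetical).
# Given last month's finding that a promotion lifted sales, simulate next
# month's sales if the promotion is repeated, returning an expected value
# and a 5th-95th percentile range rather than a single historical chart.
import random

def simulate_sales(base_sales, uplift_mean, uplift_sd, n_runs=10_000, seed=42):
    """Monte Carlo estimate of next month's sales under an assumed uplift."""
    rng = random.Random(seed)  # seeded for reproducibility
    outcomes = sorted(
        base_sales * (1 + rng.gauss(uplift_mean, uplift_sd))
        for _ in range(n_runs)
    )
    return {
        "expected": sum(outcomes) / n_runs,
        "p05": outcomes[int(0.05 * n_runs)],
        "p95": outcomes[int(0.95 * n_runs)],
    }

# e.g. a $1.0m base, with the promotion assumed to add 8% ± 3%
result = simulate_sales(1_000_000, uplift_mean=0.08, uplift_sd=0.03)
```

An analysis that ships with this kind of simulation can answer the “top 5 actions” question in tactical terms – each candidate action gets its own simulated range – which is exactly the gap described above.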
 
Corinium: How are you currently thinking about AI and Machine Learning in terms of their potential to deliver benefits to your business and customers?

KS: AI, machine learning, deep learning – these are all mere tools that can answer an existing problem with more clarity and better accuracy for the future. In real-world terms, it’s like using a calculator to do the maths quickly and accurately instead of pen and paper. Decisions are being made using data every day by everyone around us. What these new technologies and tools enable us to do better is add an extra edge to the solution – they let us think through multiple scenarios and simulations at the press of a button, answer some very convoluted and complicated questions about behaviour and intent and, most importantly, scale our problem-solving capabilities rapidly.
 
Corinium: What would your top tips be for anyone looking to build the technology and culture to enable data-driven decision making across their organisation?

KS: One crucial suggestion for companies investing in going deeper would be to set up a less fragmented, more cohesive foundation for their data and analytics teams. That involves thinking about the data and analytics team as a horizontal, not as a vertical tied to individual departments. Think of Data and Analytics as you would your Finance or Human Resources team – a horizontal for the whole company.

Rather than having a separate Marketing Analytics department that churns out amazing insights into customer behaviour, and a separate Product Analytics department that generates its own deeper insights running counter to the first team’s – which is quite consuming in terms of time and energy – it pays to have all the analysts and data scientists under one common umbrella, but still deputed to work for individual teams. This is where cross-functional thinking plays a big role: it can not only unify data initiatives but also help in scaling them efficiently.

