Deakin discusses big data and social analytics.
Companies embarking on a big-data journey must realise they are engaging in a value exchange with their customers, one that requires them to tailor information to make a website or product more useful or interesting while remaining vigilant about not crossing the line into privacy or ethics violations.
That was the view of the panellists at The Australian and Deakin Business School Big Data and Social Analytics event attended by data scientists and entrepreneurs from some of Australia’s leading tech firms.
The recent scandal around Facebook’s release of user data to third parties such as Cambridge Analytica was front of mind for the panellists, particularly given the $US120 billion ($162bn) fall in Facebook’s market value as customers deserted the platform, partly over privacy concerns.
Emma Lo Russo, chief executive and founder of social analytics platform Digivizer, said the Facebook scandal was something of a wake-up call in how easily a company could be punished for not jealously protecting customer data. But the genie was out of the bottle, she said, in terms of how readily people shared their personal data and how powerful a tool it had become in the hands of advertising companies such as Facebook.
“If I had a lot of money, I would be putting it into Facebook right now,” Lo Russo said. “Whether it is GDPR (EU legislation requiring companies to obtain consent on data collection from consumers) or Cambridge Analytica, just think about permission. If your customer is happy to give you their data — and we can tell you they are and they do — there haven’t been big changes in terms of people sharing and how they’re using the platforms.”
Lo Russo said Facebook and its other social media platforms, Instagram and WhatsApp, remained popular and were places where brands needed to be.
“You absolutely cannot be a successful brand without a social media strategy,” Lo Russo said. “You don’t have to be on social but your customers will be and they will be talking about you. Facebook has got 17 million Australians on it. If you’re not there, you can’t defend, engage and do something with it.
“People are prepared to share their data if you do something great with it. They want to be entertained or infotained, or have their life made better or more meaningful, and that’s your job.”
Adrian Letchford, senior data scientist at peer-to-peer job platform Airtasker, said making people’s lives easier through data was the goal of every company using big data, but pitfalls lay in how people perceived their data was being used, especially as data became ubiquitous across the internet.
Big data uses algorithms to crunch numbers for an optimal outcome, but with companies using that data for profit, the outcomes may be at odds with social values, Letchford said.
“The biggest problem at the moment, really, is discrimination because we’re able to look into your data and discriminate you from other people. One big issue that came up a couple of years ago in the UK is the government banned insurers from discriminating based on gender. I think they were charging men more for car insurance because they would have massive accidents when they’re speeding through the highway.
“A year or so later, a friend of mine, an economist, did an analysis of all the insurers’ prices and they were still discriminating. They deleted the gender variable and they were just doing it based on jobs. So maybe mining jobs tend to be more male. Maybe nursing jobs tend to be more female. Then they were changing their insurance based on this.
“That, for me, is the biggest issue because where do we draw the line there? What happens when you start building AI bots to hire or fire people and then we start discriminating simply because the machine says so?”
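The mechanism Letchford describes is often called proxy discrimination: drop the protected attribute and a correlated field carries the same signal. The sketch below is a minimal, self-contained illustration with made-up numbers and a hypothetical occupation-based pricing rule; it is not any insurer’s actual model.

```python
import random

# Synthetic illustration of proxy discrimination: the pricing rule never
# sees gender, only occupation, yet its output still differs by gender
# because occupation is correlated with gender in the (made-up) data.
random.seed(0)

def make_customer():
    gender = random.choice(["M", "F"])
    # Hypothetical correlation: miners skew male, nurses skew female.
    occupation = random.choices(
        ["miner", "nurse"],
        weights=[0.8, 0.2] if gender == "M" else [0.2, 0.8],
    )[0]
    return gender, occupation

# A "gender-blind" premium rule based only on occupation.
BASE_PREMIUM = {"miner": 1200.0, "nurse": 900.0}

customers = [make_customer() for _ in range(10_000)]

def average_premium(target_gender):
    premiums = [BASE_PREMIUM[occ] for g, occ in customers if g == target_gender]
    return sum(premiums) / len(premiums)

print(f"average premium quoted to men:   {average_premium('M'):.2f}")
print(f"average premium quoted to women: {average_premium('F'):.2f}")
# Despite never consulting gender, the two averages diverge:
# occupation has acted as a proxy for the deleted variable.
```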
Kshira Saagar, head of analytics at clothing retailer The Iconic, said the company focused its efforts on working out not who someone was but what kinds of behaviour they demonstrated. Beyond a customer’s email address and delivery location, he said, the company didn’t hold much data on gender or age, and the role of its algorithms was to make inferences that customised the shopping experience.
“We don’t want to treat you based on who you are, but how you behave,” Saagar said.
“You could be an 85-year-old grandma and shop like a 20-year-old kid. We’ll treat you like a 20-year-old kid because that’s what you’re expressing yourself as, and that’s your intent and behaviour.”
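As an illustration of the behaviour-first approach Saagar describes, the sketch below assigns a shopper to a style segment purely from what they browse, never consulting age or gender. The segment names, event structure and simple majority rule are hypothetical; The Iconic’s actual system is not public.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Event:
    """A single interaction (e.g. a product view), stripped of demographics."""
    category: str  # e.g. "streetwear", "formal", "activewear"

def infer_segment(events: list[Event]) -> str:
    """Pick a segment from browsing behaviour alone; age, gender and
    location are never consulted."""
    if not events:
        return "new-visitor"
    counts = Counter(e.category for e in events)
    top_category, _ = counts.most_common(1)[0]
    return top_category

# The 85-year-old grandma who shops like a 20-year-old gets the
# 20-year-old's experience, because only behaviour is observed.
recent_views = [Event("streetwear"), Event("streetwear"), Event("formal")]
print(infer_segment(recent_views))  # -> "streetwear"
```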
Saagar did admit that the ultimate goal for marketers could easily cross the line between helpful and intrusive.
“There’s still an ethical debate on how do we use this for marketing. For example, we can have an algorithm that can predict what are they doing, when are they shopping — but what we’ll do with it, that’s the next question. So the idea is you can know whatever you want, but we don’t want to go (to) extremes.”
Saagar said that the law would catch up with technology, just as it had done throughout history. He brought up the example of credit cards, noting that despite concerns about credit-card theft, people found them far more convenient than using cash.
“That’s the same thing that has happened with online,” he said.
“We get in there and people say, ‘What happens if everybody steals it?’ Somebody will come up with laws and ways to protect them. If you don’t want to go there, you will not be able to do anything.”
For Digivizer’s Lo Russo, consumers themselves may move faster than the law, and in the digital age a mistake can be far more costly than a legal punishment.
“The consumer is more empowered; they expect better experiences; they want you to know them; they want you to know what they care about,” she said.
“You will learn that through behaviour, and so the only companies that will succeed in the future will be the ones that are genuinely investing in technology and experiences for their customers that are very immersive,” she added.
“Your values are actually going to become more critical than anything else.
“Your company will need to be really guided by that and share that with your customers, because of their ability to move and vote very quickly, with the flick of their wrist.”
—
Originally published in The Australian.