No Prisoners: e-Commerce Uses Game Theory to Capture Consumer Share of Wallet, Thanks to the Nash Equilibrium

It is that time of year again. Christmas shopping must be done and deals are flying into all consumers’ inboxes. Whilst reading ‘The Bargaining Problem’ by one of my favourite mathematician-economists, I got thinking about how regularly we use game theory in eCommerce, reflecting on my own Christmas shopping behaviour (both as a consumer and as a retailer). Indeed, the rise and rapid proliferation of technology has forced companies to adapt in order to stay relevant and competitive. Specifically, the consumer retail industry has recently navigated this changing landscape by utilising collected consumer data to make prices dynamic depending on seasonal factors, locations (for international businesses) and even the strategies of their competitors. These companies are trying to harness the power of technology to achieve perfect price discrimination, where the seller knows every buyer’s willingness to pay and can therefore maximise retail prices without exceeding the buyer’s walk-away price.

John Nash and the Equilibrium Point

John Forbes Nash Jr. (June 13, 1928 – May 23, 2015) was an American mathematician who made fundamental contributions to game theory, differential geometry, and the study of partial differential equations. Nash’s work has provided insight into the factors that govern chance and decision-making inside complex systems found in everyday life.

His theories are widely used in economics. Serving as a Senior Research Mathematician at Princeton University during the latter part of his life, he shared the 1994 Nobel Memorial Prize in Economic Sciences with game theorists Reinhard Selten and John Harsanyi. In 2015, he also shared the Abel Prize with Louis Nirenberg for his work on nonlinear partial differential equations.

John Nash is the only person to be awarded both the Nobel Memorial Prize in Economic Sciences and the Abel Prize.

Game theorists use the Nash equilibrium concept to analyse the outcome of the strategic interaction of several decision makers. In other words, it provides a way of predicting what will happen if several people or several institutions are making decisions at the same time, and if the outcome depends on the decisions of the others. The simple insight underlying John Nash’s idea is that one cannot predict the result of the choices of multiple decision makers if one analyses those decisions in isolation. Instead, one must ask what each player would do, taking into account the decision-making of the others.
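To make that idea concrete, here is a minimal sketch in Python of a toy two-retailer pricing game. The payoff numbers are purely illustrative assumptions, not data from any real retailer; the code simply checks every strategy profile for a profitable unilateral deviation, which is exactly the Nash equilibrium condition.

```python
# A minimal sketch of Nash's idea on a toy 2x2 pricing game.
# The payoff numbers are illustrative assumptions only.

import itertools

# payoffs[(row_strategy, col_strategy)] = (row_payoff, col_payoff)
# Two retailers each choose "High" or "Low" pricing.
payoffs = {
    ("High", "High"): (10, 10),  # both keep their margins
    ("High", "Low"):  (2, 14),   # the discounter steals share
    ("Low",  "High"): (14, 2),
    ("Low",  "Low"):  (5, 5),    # price war erodes both margins
}

strategies = ["High", "Low"]

def is_nash(profile):
    """A profile is a Nash equilibrium if neither player can gain
    by unilaterally switching strategy."""
    row, col = profile
    row_pay, col_pay = payoffs[profile]
    # Can the row player do better by deviating on its own?
    if any(payoffs[(alt, col)][0] > row_pay for alt in strategies if alt != row):
        return False
    # Can the column player do better by deviating on its own?
    if any(payoffs[(row, alt)][1] > col_pay for alt in strategies if alt != col):
        return False
    return True

for profile in itertools.product(strategies, repeat=2):
    if is_nash(profile):
        print("Nash equilibrium:", profile, "payoffs:", payoffs[profile])
```

With these particular payoffs the only equilibrium is the mutual “Low” price: a prisoner’s-dilemma-style outcome in which neither retailer can improve by raising prices alone, even though both would be better off if they could.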


Game theory applied to eCommerce

In the context of e-Commerce, the retailer and the consumer are the two players. A game theory graph would illustrate the company’s best response to the consumer’s willingness to pay, and the consumer’s response to the retail price the company is offering for the product. There is no dominant strategy in this game, because the customer has information about, and access to, other sellers, while the company is connected to millions of other buyers via the Internet. Either party can forgo the transaction and simply find another company to buy from, or another customer to sell to. This can be illustrated with an example I read about in a couple of articles whilst researching: on Black Friday, Amazon dropped the price of a Samsung TV from $350 to $250, deciding on this final price using collected data, which allowed it to surpass the competition. Amazon took this a step further by hiking the price of HDMI cables, a complementary product, knowing from consumer data that people are less likely to shop around for the lowest price on small items than on big-ticket items. The customers’ willingness to pay was any price lower than what the competitors were offering, which turned out to be the $250 (implying that no other retailer offered a price that low).
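The pricing logic in that example can be sketched in a few lines. This is only an illustration under stated assumptions: the $250 TV price comes from the story above, while the willingness-to-pay estimates, the rival prices and the “does the buyer shop around?” flags are hypothetical.

```python
# A rough sketch of the pricing logic described above. The retailer prices
# each item just below the buyer's walk-away point: the lower of the best
# competitor price and the buyer's willingness to pay, but the competitor
# price only bites if the buyer actually shops around. All numbers other
# than the $250 TV price from the article are illustrative assumptions.

def optimal_price(willingness_to_pay, best_competitor_price, shops_around, undercut=1.0):
    ceiling = min(willingness_to_pay, best_competitor_price) if shops_around else willingness_to_pay
    return ceiling - undercut

basket = {
    # item: (estimated willingness to pay, best competitor price, buyer compares prices?)
    "Samsung TV": (330.0, 251.0, True),   # big-ticket item: buyers compare hard
    "HDMI cable": (35.0, 12.0, False),    # accessory: buyers rarely shop around
}

for item, (wtp, rival, shops) in basket.items():
    print(f"{item}: ${optimal_price(wtp, rival, shops):.2f}")
# -> Samsung TV: $250.00 (pinned to the rival's price)
# -> HDMI cable: $34.00  (pinned to willingness to pay, far above the cheapest rival)
```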

The implications of this show how companies are controlling not only prices but consumers’ perception of prices, using this data to surpass the competition by limiting how consumers perceive the choices in front of them. Furthermore, this serves as an illustration of another way we experience a loss of control in our lives. However, the other side of the argument says that perfect price discrimination can positively impact consumers, since their prices will be individually tailored. At the same time, this constant changing of prices can end up overwhelming consumers and deterring them from purchasing the item altogether.

As of now, it is too soon to tell how, and if, this becomes a normalised practice that we all must adapt to. It is definitely interesting to consider the ripple effects it will have on consumer behaviour in the future, as well as potential consumer protection regulation, since we are operating within a rigged sandbox where companies hold all the cards thanks to their informational advantage.

Recommended reading: http://file.scirp.org/pdf/IB_2014122310482155.pdf

Happy Christmas Shopping and bargain hunting

Benoit Mercier


Speed is everything

When you launch a website, there are many factors that must be optimised, like checkout, but speed is the one you should really focus on. At the end of the day, if all your key conversion pages are optimised but the site itself is slow, then it is wasted effort, believe me. Can the speed of your website really have that much of an effect on your conversion rate? And even if your site isn’t loading too slowly, can it still be improved?

According to surveys by Akamai and Gomez.com, nearly half of web users expect a site to load in 2 seconds or less, and they tend to abandon a site that hasn’t loaded within 3 seconds. 79% of web shoppers who have trouble with website performance say they won’t return to the site to buy again, and around 44% of them would tell a friend if they had a poor experience shopping online.

This means you’re not just losing conversions from visitors currently on your site; that loss is magnified to their friends and colleagues as well. The end result: lots of potential sales down the drain because of a few seconds’ difference.
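To put a number on “sales down the drain”, here is a back-of-the-envelope sketch. The traffic, conversion rate, order value and abandonment-per-second figures are assumptions chosen for illustration, loosely in the spirit of the survey numbers above; swap in your own analytics figures.

```python
# Back-of-the-envelope sketch of what extra load time can cost.
# All inputs below are hypothetical assumptions for illustration.

monthly_visits        = 500_000    # hypothetical traffic
conversion_rate       = 0.02       # 2% baseline CR
average_order_value   = 60.0       # in £
abandon_per_extra_sec = 0.07       # assume ~7% of visitors lost per extra second over budget

def lost_revenue(extra_seconds):
    lost_visits = monthly_visits * abandon_per_extra_sec * extra_seconds
    return lost_visits * conversion_rate * average_order_value

for extra in (1, 2, 3):
    print(f"{extra}s over budget -> ~£{lost_revenue(extra):,.0f} of revenue at risk per month")
# e.g. 1s over budget: 500,000 * 0.07 = 35,000 lost visits
#      35,000 * 0.02 * £60 = £42,000 at risk per month
```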

How fast should my website be?

This is a question I get asked a lot. I have naively always strived for the magical number of 2 seconds, but let me put it out there: ‘Not possible’! I know the stats and the graphs, but for a retailer this number is fantasy. So what should it be?

Well, in my opinion, you should do two things. Firstly, make sure that you are faster than you were the previous year. If you build a new website, it should not be slower. There is nothing wrong with comparing YoY. However, remember that it might not be LFL (like-for-like) because you may have changed the amount of content on a specific web page (e.g. larger images on your product detail page). Secondly, you should benchmark your key competitors. At the end of the day, if you are faster than them, you can gain a competitive advantage.

Speed and Google

Let me clear this one up. A slow website doesn’t currently harm your organic ranking. Will that change? The speed of your mobile pages currently doesn’t impact your mobile rankings, but soon it may, says Gary Illyes of Google. Read this article.

What tools to use?

There are plenty out there. For me, the best free ones are WebPageTest and GA. I use them on a weekly basis and find them incredibly rich in insight. I know a lot of web developers use Lighthouse, but for me it is rubbish: you can run the same test 10 times in a row and get huge variations.
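One cheap way to tame that run-to-run variation is to repeat the same measurement and keep the median. The sketch below, assuming the `requests` library and a placeholder URL, times a plain HTTP fetch rather than a full browser render, so it only tracks back-end response time, but it makes a quick weekly sanity check.

```python
# Repeat the same measurement and report the median, so a single noisy
# run doesn't skew the trend. The URL is a placeholder assumption.

import statistics
import requests

URL = "https://www.example.com/"   # swap in one of your key pages
RUNS = 10

samples = []
for _ in range(RUNS):
    resp = requests.get(URL, timeout=30)
    # resp.elapsed is roughly the time from sending the request until the
    # response headers arrive, i.e. a proxy for time to first byte
    samples.append(resp.elapsed.total_seconds())

print(f"median time to first byte over {RUNS} runs: {statistics.median(samples):.3f}s")
```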

Also, if you can afford it, find an excellent monitoring partner. I use NCC and they are fantastic. This is their area of expertise and they have some excellent tools. One that I am currently testing is their RUM tool, which can predict how much cash you are losing, or could potentially gain, if you were to increase speed.

How to measure speed?

Speed Index. Yes, you need to monitor your Speed Index on a monthly basis, if not weekly if you have just replatformed. Read more.

These are all the KPIs you must look at:

  • First Byte – the time from when the user started navigating to the page until the first bit of the server response arrived. The bulk of this time is usually referred to as the “back-end time” and is the amount of time the server spent building the page for the user.
  • Start Render – the first point in time that something was displayed to the screen. Before this point the user was staring at a blank page. It does not necessarily mean the user saw the page content; it could be something as simple as a background colour, but it is the first indication of something happening for the user.
  • DOM load – DOM ready means that all the HTML has been received and parsed by the browser into the DOM tree, which can now be manipulated. It occurs before the page has been fully rendered, as external resources (images, CSS, JavaScript and any other linked resources) may not yet have fully downloaded.
  • Visually Complete – measures how long it takes to display the content visible in the user’s viewport; content “below the fold” and non-visual content (like third-party tracking beacons) is excluded.
  • Fully Loaded – the metrics grouped under the Fully Loaded heading are collected up until there has been 2 seconds of no network activity after Document Complete. This will usually include any activity triggered by JavaScript after the main page loads.
  • Page weight – the total size of the page with all its content (HTML, images, CSS, JavaScript and so on).

and then leave all other metrics to your web developers. Optimise all of the KPIs above and your CR will start to improve; how much it improves depends on how bad your website is today.
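If you want to pull some of these figures yourself rather than rely on a tool’s report, the sketch below assumes Selenium with a local Chrome/ChromeDriver and reads the browser’s Navigation Timing entry. That maps roughly onto First Byte, DOM load and Fully Loaded; Start Render, Visually Complete and Speed Index need a filmstrip-based tool such as WebPageTest.

```python
# A minimal sketch, assuming Selenium 4 and a local Chrome/ChromeDriver,
# that reads the Navigation Timing entry for a page. The URL is a placeholder.

from selenium import webdriver

URL = "https://www.example.com/"   # replace with one of your key pages

driver = webdriver.Chrome()
try:
    driver.get(URL)   # blocks until the load event by default
    nav = driver.execute_script(
        "return performance.getEntriesByType('navigation')[0].toJSON();")
    print(f"First Byte:   {nav['responseStart']:.0f} ms")
    print(f"DOM loaded:   {nav['domContentLoadedEventEnd']:.0f} ms")
    print(f"Fully loaded: {nav['loadEventEnd']:.0f} ms")
finally:
    driver.quit()
```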

Also remember to test in different environments. For example, how fast is your website on a slow 3G connection vs. 5 Mbps Wi-Fi? Adjust latency as well.
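To approximate the slow-3G case, the same Selenium/Chrome setup can throttle the connection through the DevTools command Network.emulateNetworkConditions before loading the page. The latency and throughput numbers below are illustrative assumptions, not an official 3G profile.

```python
# A sketch of loading a page under a throttled connection, assuming
# Selenium 4 with Chrome. The throttle values and URL are assumptions.

from selenium import webdriver

driver = webdriver.Chrome()
try:
    driver.execute_cdp_cmd("Network.enable", {})
    driver.execute_cdp_cmd("Network.emulateNetworkConditions", {
        "offline": False,
        "latency": 400,                          # extra round-trip latency in ms
        "downloadThroughput": 400 * 1024 // 8,   # ~400 kbps down, in bytes/sec
        "uploadThroughput": 400 * 1024 // 8,     # ~400 kbps up, in bytes/sec
    })
    driver.get("https://www.example.com/")       # placeholder URL
    nav = driver.execute_script(
        "return performance.getEntriesByType('navigation')[0].toJSON();")
    print(f"Fully loaded on throttled connection: {nav['loadEventEnd']:.0f} ms")
finally:
    driver.quit()
```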

Conclusion

Ignore website speed at your peril, as the CR gains are potentially huge. If you haven’t got time, then appoint a partner. If you have invested in a Ferrari but it is only as fast as a Skoda, then you are burning cash away! Monitoring takes no more than an hour once you select key pages (e.g. homepage, category landing page, product detail page) and will give you a clear set of focus areas to work on with your web developers. Good luck.

Benoit Mercier

Google begins mobile-first indexing, using mobile content for all search rankings

Google has begun testing its mobile-first index, which will primarily look at the mobile version of your website for its ranking signals and fall back on the desktop version when there is no mobile version.

Businesses that have made the move to have their platform responsive will be delighted to read the below. If you haven’t then you need to ask yourself whether you should 1) go responsive or 2) recruit more people to ensure that all your content is available on your mobile site.

Most Google searches are mobile, but Google’s index is desktop

Google explained that it sees more mobile searches than desktop searches on a daily basis. But when Google evaluates a page’s ranking, it currently looks at the desktop version of the site: an issue that was pointed out over a year ago. To fix this, Google will look at the content, links and structured data of the mobile version of your site if one is available.

Google wrote:

To make our results more useful, we’ve begun experiments to make our index mobile-first. Although our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site, to understand structured data, and to show snippets from those pages in our results. Of course, while our index will be built from mobile documents, we’re going to continue to build a great search experience for all users, whether they come from mobile or desktop devices.

With this change, Google will primarily index mobile content and use that to decide how to rank its results, regardless of whether you’re searching on desktop or mobile. There will no longer be any type of “mobile-friendly” adjustment done just for mobile users. Effectively, if you’re not mobile-friendly, that will have an impact even on how you appear for desktop searchers.

Google is testing this but hopes to roll it out to all

Google said it has started this experiment and will “continue to carefully experiment over the coming months on a small scale.” Google will “ramp up this change when we’re confident that we have a great user experience.”

No mobile site? Don’t worry

Those who do not have a mobile version of their website do not need to worry. Google will just use the desktop version to rank the site. Google wrote, “[I]f you only have a desktop site, we’ll continue to index your desktop site just fine, even if we’re using a mobile user agent to view your site.” This also means that if you have a responsive site, or a dynamic serving site that serves equivalent content to desktop and mobile devices, there’s nothing special you need to do.

Of course, if you do not have a mobile site, you won’t benefit from the mobile-friendly ranking boost. But that is separate from this mobile index news.

How can you prepare?

Here are some recommendations Google is giving webmasters to prepare for the change:

  • If you have a responsive site or a dynamic serving site where the primary content and markup is equivalent across mobile and desktop, you shouldn’t have to change anything.
  • If you have a site configuration where the primary content and markup is different across mobile and desktop, you should consider making some changes to your site.
      • Make sure to serve structured markup for both the desktop and mobile version. Sites can verify the equivalence of their structured markup across desktop and mobile by typing the URLs of both versions into the Structured Data Testing Tool and comparing the output (a scripted comparison is sketched after this list).
      • When adding structured data to a mobile site, avoid adding large amounts of markup that isn’t relevant to the specific information content of each document.
      • Use the robots.txt testing tool to verify that your mobile version is accessible to Googlebot (also covered in the sketch after this list).
      • Sites do not have to make changes to their canonical links; we’ll continue to use these links as guides to serve the appropriate results to a user searching on desktop or mobile.
  • If you are a site owner who has only verified your desktop site in Search Console, please add and verify your mobile version.
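For the two checks flagged above, here is a minimal scripted version. The hostnames and URLs are hypothetical placeholders: the sketch verifies via the standard-library robots.txt parser that Googlebot may fetch the mobile page, then compares the JSON-LD blocks served by the desktop and mobile versions of the same page. A mismatch is not automatically a problem, but it is a prompt to run both URLs through the Structured Data Testing Tool.

```python
# A minimal sketch of two mobile-first preparation checks.
# All URLs below are hypothetical placeholders.

import json
import re
import urllib.robotparser
import requests

MOBILE_ROBOTS = "https://m.example.com/robots.txt"      # hypothetical mobile host
DESKTOP_URL   = "https://www.example.com/product/123"   # hypothetical page pair
MOBILE_URL    = "https://m.example.com/product/123"

# 1) Is the mobile page fetchable by Googlebot according to robots.txt?
rp = urllib.robotparser.RobotFileParser(MOBILE_ROBOTS)
rp.read()
print("Googlebot may fetch mobile page:", rp.can_fetch("Googlebot", MOBILE_URL))

# 2) Do desktop and mobile serve equivalent JSON-LD structured data?
def json_ld_blocks(url):
    html = requests.get(url, timeout=30).text
    pattern = r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>'
    return [json.loads(block) for block in re.findall(pattern, html, re.S)]

desktop, mobile = json_ld_blocks(DESKTOP_URL), json_ld_blocks(MOBILE_URL)
print("Structured data equivalent:", desktop == mobile)
```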

Good luck

Benoit Mercier

Check your returns policy

Some interesting research this morning about returns.

17 stores ‘misled online shoppers about legal rights’. The Metro reports that a study by MoneySavingExpert.com found that 17 retailers were misleading shoppers about their online returns policies. The retailers either “hid” the correct policy or displayed a policy which did not comply with the law, which was changed 20 months ago to allow consumers 28 days to cancel and return an order. Thirteen of the retailers highlighted in the report – including JD Sports, New Look and Next – have said they will review their policies or make changes to their online stores. The other four have been reported by MoneySavingExpert.com to trading standards.

Benoit Mercier

The importance of Big Data

I read an interesting article by Louis Columbus in Forbes on how big data is among the top five most disruptive innovations. Based on his research, he noted that:

  • 47% of manufacturers expect big data analytics to have a major impact on company performance making it core to the future of digital factories.
  • 36% expect mobile technologies and applications to improve their company’s financial performance today and in the future.
  • 49% expect advanced analytics to reduce operational costs and utilise assets efficiently.

Working in the ecommerce sphere, I tend to agree with Louis’s view and here are the reasons why.

“You can’t manage what you can’t measure”

There’s much wisdom in that saying, which has been attributed to both W. Edwards Deming and Peter Drucker, and it explains why the recent explosion of digital data is so important. Simply put, because of big data, managers can measure, and hence know, radically more about their businesses, and directly translate that knowledge into improved decision making and performance.

The familiarity of the Amazon story almost masks its power. We expect companies that were born digital to accomplish things that business executives could only dream of a generation ago, or even just a few years ago. But in fact the use of big data has the potential to transform traditional businesses as well. It may offer them even greater opportunities for competitive advantage (online businesses have always known that they were competing on how well they understood their data). As we’ll discuss in more detail, the big data of this revolution is far more powerful than the analytics that were used in the past. We can measure and therefore manage more precisely than ever before. We can make better predictions and smarter decisions. We can target more-effective interventions, and can do so in areas that so far have been dominated by gut and intuition rather than by data and rigour.

An HBR article written by Andrew McAfee and Erik Brynjolfsson states that as the tools and philosophies of big data spread, they will change long-standing ideas about the value of experience, the nature of expertise, and the practice of management. Smart leaders across industries will see using big data for what it is: a management revolution. But as with any other major change in business, the challenges of becoming a big data–enabled organization can be enormous and require hands-on—or in some cases hands-off—leadership. Nevertheless, it’s a transition that executives need to engage with today.

1. What is big data analytics?

According to SAS, big data analytics is the process of examining big data to uncover hidden patterns, unknown correlations and other useful information that can be used to make better decisions. With big data analytics, data scientists and others can analyse huge volumes of data that conventional analytics and business intelligence solutions can’t touch.
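As a toy illustration of the kind of “unknown correlation” that analysis hunts for, the sketch below uses pandas on a handful of made-up order records (not real data) to ask whether slower delivery goes hand in hand with fewer repeat purchases.

```python
# A toy example of hunting for a correlation in order data.
# The records below are invented purely for illustration.

import pandas as pd

orders = pd.DataFrame({
    "delivery_days":    [1, 2, 2, 3, 4, 5, 5, 7],
    "repeat_purchases": [6, 5, 6, 4, 3, 2, 1, 0],
})

print(orders.corr())   # strong negative correlation on this toy data
```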

I have not worked in a business that is not obsessed with analysing data, whether it is customer data, web data, infrastructure data and so on. In fact I know that most businesses are, because data analysts are very hard to find; if you are 16 or 18 years old and heading for a good maths degree, I would seriously consider venturing into this type of role!

2. What has changed in the past 3 years?

Volume

As of 2012, about 2.5 exabytes of data were created each day, and that number is doubling every 40 months or so. More data cross the internet every second than were stored in the entire internet just 20 years ago. This gives companies an opportunity to work with many petabytes of data in a single data set, and not just from the internet. For instance, it is estimated that Walmart collects more than 2.5 petabytes of data every hour from its customer transactions. A petabyte is one quadrillion bytes, or the equivalent of about 20 million filing cabinets’ worth of text. An exabyte is 1,000 times that amount, or one billion gigabytes.

Velocity

For many applications, the speed of data creation is even more important than the volume. Real-time or nearly real-time information makes it possible for a company to be much more agile than its competitors. This is where I feel even more progress will be made: our systems, and human nature, push us to get things faster and faster and faster!

Variety

Big data takes the form of messages, updates, and images posted to social networks; readings from sensors; GPS signals from cell phones, and more. Many of the most important sources of big data are relatively new. The structured databases that stored most corporate information until recently are ill suited to storing and processing big data. At the same time, the steadily declining costs of all the elements of computing—storage, memory, processing, bandwidth, and so on—mean that previously expensive data-intensive approaches are quickly becoming economical.

As more and more business activity is digitised, new sources of information and ever-cheaper equipment combine to bring us into a new era: one in which large amounts of digital information exist on virtually any topic of interest to a business. Mobile phones, online shopping, social networks, electronic communication, GPS, and instrumented machinery all produce torrents of data as a by-product of their ordinary operations. Each of us is now a walking data generator. I work in retail, and while we are not at the forefront of new technology, we are getting there; beacons are one example of that.

3. Why is big data important? Benefits and challenges

A report from McKinsey Global Institute estimates that Big Data could generate an additional $3 trillion in value every year in just seven industries. Of this, $1.3 trillion would benefit the United States. The report also estimated that over half of this value would go to customers in forms such as fewer traffic jams, easier price comparisons, and better matching between educational institutions and students. Note that some of these benefits do not affect GDP or personal income as we measure them. They do, however, imply a better quality of life.

Out of hundreds of ideas, McKinsey believes big data analytics is one of the top five catalysts that can increase US productivity and raise GDP over the next seven years. For the retail sector, big data applications covered three areas: supply chain, operations and merchandising. By creating greater performance transparency, retailers can optimise inventory, transportation, returns, labour, assortments and more. McKinsey estimates that this sector will gain $30-55 billion in GDP through the use of big data. A previous article on 20+ big data examples provided links to stories about how Walmart, Sears, Kmart and Amazon are using big data. The McKinsey figure that will make my CFO and CEO listen is the 60% potential increase in retailers’ operating margins possible with Big Data.

5 key benefits of big data:

1. Big Data can unlock significant value by making information transparent. There is still a significant amount of information that is not yet captured in digital form, e.g. data that are on paper, or not made easily accessible and searchable through networks. McKinsey found that up to 25 per cent of the effort in some knowledge-worker workgroups consists of searching for data and then transferring it to another (sometimes virtual) location. This effort represents a significant source of inefficiency.

2. As organisations create and store more transactional data in digital form, they can collect more accurate and detailed performance information on everything from product inventories to sick days, and therefore expose variability and boost performance. In fact, some leading companies are using their ability to collect and analyse big data to conduct controlled experiments that lead to better management decisions (a minimal sketch of evaluating such an experiment follows after this list).

3. Big Data allows ever-narrower segmentation of customers and therefore much more precisely tailored products or services.

4. Sophisticated analytics can substantially improve decision-making, minimise risks, and unearth valuable insights that would otherwise remain hidden.

5. Big Data can be used to develop the next generation of products and services. For instance, manufacturers are using data obtained from sensors embedded in products to create innovative after-sales service offerings such as proactive maintenance to avoid failures in new products.
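Picking up the controlled-experiments point from item 2 above, here is a minimal sketch of how such an experiment might be evaluated: a two-proportion z-test on a hypothetical checkout test, using only the Python standard library. The visit and order counts are made up for illustration.

```python
# Evaluate a hypothetical A/B test on checkout conversion with a
# two-proportion z-test. The counts below are invented for illustration.

from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided
    return p_a, p_b, z, p_value

# control: 20,000 visits, 400 orders; variant: 20,000 visits, 470 orders
p_a, p_b, z, p = two_proportion_z(400, 20_000, 470, 20_000)
print(f"control CR {p_a:.2%}, variant CR {p_b:.2%}, z = {z:.2f}, p = {p:.3f}")
# On these made-up numbers z is roughly 2.4 and p roughly 0.016, so the
# uplift would be unlikely to be pure chance.
```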

However, it is not all that simple, and McAfee and Brynjolfsson identified five key challenges of big data:

1. Leadership: Companies succeed in the big data era not simply because they have more or better data, but because they have leadership teams that set clear goals, define what success looks like, and ask the right questions. Big data’s power does not erase the need for vision or human insight. On the contrary, we still must have business leaders who can spot a great opportunity, understand how a market is developing, think creatively and propose truly novel offerings, articulate a compelling vision, persuade people to embrace it and work hard to realize it, and deal effectively with customers, employees, stockholders, and other stakeholders. The successful companies of the next decade will be the ones whose leaders can do all that while changing the way their organisations make many decisions.

2. Talent Management: As data become cheaper, the complements to data become more valuable. Some of the most crucial of these are data scientists and other professionals skilled at working with large quantities of information. Statistics are important, but many of the key techniques for using big data are rarely taught in traditional statistics courses. Perhaps even more important are skills in cleaning and organizing large data sets; the new kinds of data rarely come in structured formats. Visualization tools and techniques are also increasing in value. Along with the data scientists, a new generation of computer scientists are bringing to bear techniques for working with very large data sets. Expertise in the design of experiments can help cross the gap between correlation and causation.

3. Technology: The tools available to handle the volume, velocity, and variety of big data have improved greatly in recent years. In general, these technologies are not prohibitively expensive, and much of the software is open source. Hadoop, the most commonly used framework, combines commodity hardware with open-source software. It takes incoming streams of data and distributes them onto cheap disks; it also provides tools for analyzing the data. However, these technologies do require a skill set that is new to most IT departments, which will need to work hard to integrate all the relevant internal and external sources of data. Although attention to technology isn’t sufficient, it is always a necessary component of a big data strategy.
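For a feel of what “commodity hardware plus open-source software” looks like in practice, below is a classic Hadoop Streaming-style mapper and reducer in Python that total sales per product. The two-column CSV layout, the file names and the job invocation are assumptions for illustration, not taken from any particular deployment.

```python
# mapper.py -- Hadoop Streaming runs this on each chunk of input that the
# framework has distributed across the cluster's disks.
# Assumed input layout per line: product_id,amount
import sys

for line in sys.stdin:
    try:
        product_id, amount = line.strip().split(",")
        print(f"{product_id}\t{float(amount)}")
    except ValueError:
        continue  # skip malformed lines
```

```python
# reducer.py -- receives the mapper output sorted by key and sums per product.
import sys

current_key, total = None, 0.0
for line in sys.stdin:
    key, value = line.strip().split("\t")
    if key != current_key:
        if current_key is not None:
            print(f"{current_key}\t{total}")
        current_key, total = key, 0.0
    total += float(value)
if current_key is not None:
    print(f"{current_key}\t{total}")
```

A typical (assumed) invocation would be along the lines of `hadoop jar hadoop-streaming.jar -files mapper.py,reducer.py -mapper mapper.py -reducer reducer.py -input sales/ -output sales_totals/`; Hadoop handles the splitting, shuffling and sorting in between.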

4. Decision making: An effective organisation puts information and the relevant decision rights in the same location. In the big data era, information is created and transferred, and expertise is often not where it used to be. The artful leader will create an organization flexible enough to minimize the “not invented here” syndrome and maximize cross-functional cooperation. People who understand the problems need to be brought together with the right data, but also with the people who have problem-solving techniques that can effectively exploit them.

5. Company culture: The first question a data-driven organisation asks itself is not “What do we think?” but “What do we know?” This requires a move away from acting solely on hunches and instinct. It also requires breaking a bad habit noticed in many organisations: pretending to be more data-driven than they actually are. Too often, executives spice up their reports with lots of data that support decisions they have already made using the traditional HiPPO (highest-paid person’s opinion) approach, and only afterwards are underlings dispatched to find the numbers that would justify the decision.

Without question, many barriers to success remain. There are too few data scientists to go around. The technologies are new and in some cases exotic. It is too easy to mistake correlation for causation and to find misleading patterns in the data. The cultural challenges are enormous and, of course, privacy concerns are only going to become more significant. But the underlying trends, both in the technology and in the business payoff, are unmistakable.

Convinced that Big Data should be part of your business strategy for the next five years? If not, you might be heading down the well, and your business with you!

Benoit Mercier