When selling products with a relatively short shelf life, it becomes necessary to reduce the price of items as they approach their use-by date so they sell rather than being thrown out. This is done for several reasons.

- Sending plastic packaging and food that's safe to eat to landfill is a waste of resources, and in 2019 we can do a lot better than that
- If a business can reduce its waste, it can save money on garbage collection
- It's an acknowledgement to customers that the product isn't as flexible as one with a longer use-by date
- It adds a little excitement for customers, who can buy a product they'd normally consider too expensive

In my experience, supermarkets perform markdowns incrementally, to make sure that stock isn't wasted and to recover as much money for the product as possible. This was usually accomplished by reducing the price by 20% on the second-last day of sale, 30% on the morning of the last day, 40% at lunchtime on the last day, and then 50%, 60%, 70%, 80%, and 90% until the close of the store. In essence, it's a Dutch auction. This works well but has problems. People buy the most popular items first, leaving an assortment of unappealing products towards the end. Because fixed percentages are used across the board, some items are reduced more than necessary and some less. It's also reasonably labour intensive.
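As a minimal sketch, the fixed schedule above is just an increasing discount applied to the original price. The helper below is illustrative only; the times and percentages are the ones described above, not from a real system.

```python
def marked_down_price(original_price, percent):
    """Price after applying a markdown percentage (illustrative helper)."""
    return round(original_price * (1 - percent / 100), 2)

# The fixed schedule described above, applied to a $10.00 item:
for percent in (20, 30, 40, 50, 60, 70, 80, 90):
    print(f"{percent}% -> ${marked_down_price(10.00, percent):.2f}")
```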

If you track markdowns and plot the most common ones, you end up with a graph like the one below, which I'll use to demonstrate a point.

*Markdowns shouldn't waste time or money*

In the graph above you can see that the average markdown is around 50%. Traditionally you'd start with a 20% markdown, but the graph shows that in this case almost no one will buy the product at that discount. This is the "Waste of Time" section of the graph: most likely you'll have to come back and do another markdown on the product. Conversely, if you go straight to an 80% markdown you've wasted money, because you most likely could have sold the product for more. This is the "Waste of Money" section. Ideally most of the markdowns should be in the 30% to 70% range, the "Sweet Spot". But how do you calculate the markdown percentages?

I'm going to demonstrate a method that doesn't necessarily generate optimal markdown percentages, but does generate percentages so that popular and unpopular products sell at roughly the same rate, with a reasonable chance of selling on the first markdown.

Let's start with some lamb hearts. To keep things simple, from most to least recent, the markdown percentages are as follows: 98%, 66%, and 6%. This process uses a method called Kernel Density Estimation (KDE) to generate a smooth curve of markdown probabilities. This means that at each of those percentages you place a predefined shape, in this case a Gaussian distribution.
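The kernel placement step can be sketched as follows. The bandwidth of 10 percentage points is my assumption for illustration; the original doesn't state one.

```python
import numpy as np

x = np.linspace(0, 100, 1001)   # markdown percentage axis
markdowns = [98, 66, 6]         # lamb hearts, most recent first

def gaussian_kernel(centre, bandwidth=10.0):
    """A Gaussian bump centred on one historical markdown percentage."""
    return np.exp(-0.5 * ((x - centre) / bandwidth) ** 2) / (
        bandwidth * np.sqrt(2 * np.pi))

kernels = [gaussian_kernel(m) for m in markdowns]
```

Note that the kernels centred at 98% and 6% spill past the edges of the axis, which is exactly the truncation issue discussed next.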

*Initial placement of kernels*

You can see the three distributions above. You may notice that because the grey and blue curves are truncated at the sides of the graph, the area under them is smaller. This means they'll have less influence on the result. This can be corrected with a simple scaling process, though.
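The scaling can be sketched by dividing each kernel by its own area on the visible axis, so every kernel ends up with the same total area regardless of truncation (again assuming a bandwidth of 10):

```python
import numpy as np

x = np.linspace(0, 100, 1001)
markdowns = [98, 66, 6]

def gaussian_kernel(centre, bandwidth=10.0):
    return np.exp(-0.5 * ((x - centre) / bandwidth) ** 2)

# Divide each kernel by its area on [0, 100]: truncated kernels
# regain the same total area (1) as untruncated ones.
corrected = [k / np.trapz(k, x) for k in map(gaussian_kernel, markdowns)]
```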

*Compensation for truncated kernels*

The areas under the curves above are now all the same, but is this what we want? The most recent markdowns should have a greater impact on the process than ones performed long ago. To correct this, each kernel is weighted: 3 for the most recent, 2 for the next most recent, and 1 for the oldest markdown.
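In code, the weighting is a straight multiplication of each area-corrected kernel by its recency weight (bandwidth again assumed):

```python
import numpy as np

x = np.linspace(0, 100, 1001)
markdowns = [98, 66, 6]   # most recent first
weights = [3, 2, 1]       # the most recent markdown counts three times as much

def corrected_kernel(centre, bandwidth=10.0):
    k = np.exp(-0.5 * ((x - centre) / bandwidth) ** 2)
    return k / np.trapz(k, x)   # area-corrected for truncation

weighted = [w * corrected_kernel(m) for m, w in zip(markdowns, weights)]
```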

*Weighting the kernels for recency*

Adding each of these graphs gives the final markdown distribution in yellow.

*Summation of the kernels*

Now we perform a process called integration to get the light blue line. For those of you unfamiliar with integration, it records the area under a graph. For example, at the 20 point on the bottom axis, the blue line records the area under the yellow line up to that point.
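Numerically, this is a running (cumulative) trapezoidal sum of the yellow density curve, normalised here so the result runs from 0 to 1. The bandwidth and grid are my assumptions:

```python
import numpy as np

x = np.linspace(0, 100, 1001)
markdowns = [98, 66, 6]
weights = [3, 2, 1]

def corrected_kernel(centre, bandwidth=10.0):
    k = np.exp(-0.5 * ((x - centre) / bandwidth) ** 2)
    return k / np.trapz(k, x)

# The yellow curve: sum of the weighted, truncation-corrected kernels.
density = sum(w * corrected_kernel(m) for m, w in zip(markdowns, weights))

# Cumulative trapezoidal integration: cdf[i] is the area under the
# density curve from 0 up to x[i].
steps = (density[1:] + density[:-1]) / 2 * np.diff(x)
cdf = np.concatenate([[0.0], np.cumsum(steps)])
cdf /= cdf[-1]   # normalise so the light blue curve runs from 0 to 1
```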

*Integration of the distribution to produce the final curve*

The next step is to introduce the concept of a markdown level, which ranges from 0 to 10. 0 corresponds to a 0% markdown and 10 corresponds to a 100% markdown, but in between, things are very different. As a demonstration we'll calculate the markdown percentages for markdown levels 2, 5, and 8. The process is quite simple: find the levels on the left-hand side of the graph, project them across to the blue curve, and then down to the bottom of the graph. This gives percentages of 53%, 80%, and 94%.
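Putting the whole pipeline together, the level-to-percentage lookup is an inverse read of the integrated curve. With an assumed Gaussian bandwidth of 10 percentage points (my choice, not stated above), this sketch reproduces the figures in the text to within a percentage point or two:

```python
import numpy as np

x = np.linspace(0, 100, 1001)

def markdown_percent(level, markdowns, weights, bandwidth=10.0):
    """Map a markdown level (0-10) to a percentage via the integrated curve."""
    density = np.zeros_like(x)
    for centre, w in zip(markdowns, weights):
        k = np.exp(-0.5 * ((x - centre) / bandwidth) ** 2)
        density += w * k / np.trapz(k, x)   # truncation-corrected, weighted
    steps = (density[1:] + density[:-1]) / 2 * np.diff(x)
    cdf = np.concatenate([[0.0], np.cumsum(steps)])
    cdf /= cdf[-1]
    # Project the level across to the curve, then down to the percentage axis.
    return float(np.interp(level / 10, cdf, x))

# Lamb hearts: markdowns 98%, 66%, 6% (most recent first), weights 3, 2, 1.
for level in (2, 5, 8):
    print(level, round(markdown_percent(level, [98, 66, 6], [3, 2, 1])))
```

The same function applied to a history of small markdowns produces much smaller percentages for the same levels, which is the whole point of the method.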

*Calculating markdown percentages from the graph*

For another demonstration, let's try something popular like chicken breast, which has recent markdowns of 30%, 10%, and 20%. This leads to percentages of 10%, 22%, and 33%.

*Calculating markdown percentages from the graph*

As I said above, this isn't designed to generate an optimal markdown; it's designed to create a markdown specific to each product so that products sell at similar rates. In the demonstrations above, the situation may warrant a level 2 markdown. This means that a product that's hard to sell, like lamb hearts, gets a 53% markdown, while a popular item like chicken breast only gets a 10% markdown.

So a strategy might be to use a level 2 markdown on the second-last day of sale, and then taper from level 3 to level 10 throughout the last trading day.
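One way to sketch that taper, assuming a 12-hour trading day and a linear ramp (both my assumptions):

```python
def markdown_level(hours_until_close, trading_hours=12):
    """Taper from level 3 at opening to level 10 at close on the last day."""
    elapsed = 1 - hours_until_close / trading_hours
    return min(10, max(3, round(3 + 7 * elapsed)))
```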

The demonstrations above have been quite simple and don't give a full picture of how powerful this method can be. Let's put together a complicated situation with fake data to test it.

A new flavour of sausage is introduced and we're unsure how it will perform. It initially uses an assumed prior that isn't mentioned above, but it quickly becomes obvious that it's a good seller and doesn't require large markdowns, with the first level 2 markdown being in the region of 15%. After 500 recorded markdowns, a new, similar, cheaper line is introduced and sales of the original line suffer. Before long the system calculates that the first markdown at level 2 needs to be about 55%.

*Animation of the markdown calculation process*

The animation above is generated from 500 random markdowns with an average of 20%, followed by 500 random markdowns with an average of 80%. The calculations take into account the last 100 markdowns, weighted from 1 (oldest) to 100 (most recent).
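A sketch of that windowed calculation, reusing the same kernel/integration pipeline (bandwidth and grid again my assumptions): only the last 100 markdowns are used, weighted linearly by recency, so the suggested first markdown drifts as the sales pattern changes.

```python
import numpy as np

def first_markdown(history, level=2, window=100, bandwidth=10.0):
    """Markdown percentage at a given level, from the last `window` markdowns."""
    x = np.linspace(0, 100, 1001)
    recent = history[-window:]
    density = np.zeros_like(x)
    for weight, centre in enumerate(recent, start=1):   # oldest gets weight 1
        k = np.exp(-0.5 * ((x - centre) / bandwidth) ** 2)
        density += weight * k / np.trapz(k, x)
    steps = (density[1:] + density[:-1]) / 2 * np.diff(x)
    cdf = np.concatenate([[0.0], np.cumsum(steps)])
    return float(np.interp(level / 10, cdf / cdf[-1], x))

rng = np.random.default_rng(0)
easy = np.clip(rng.normal(20, 5, 500), 0, 100)   # sells with small markdowns
hard = np.clip(rng.normal(80, 5, 500), 0, 100)   # needs large markdowns

# After the hard-selling period, the window contains only large markdowns,
# so the suggested level 2 markdown jumps accordingly.
print(first_markdown(easy), first_markdown(np.concatenate([easy, hard])))
```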

There are a couple of important points to consider. Firstly, the process uses a lot of statistical methods and equations but isn't really grounded in theory; it uses the properties of those methods to create a stable system that fits the specified requirements. The next thing to remember is that this process will follow the markdowns: if you start the markdown process at level 8 without even trying something below level 5, the system will generate larger and larger markdowns. A stabilising term can be added to prevent this.

It's not a perfect method, but it's a lot better than some of the optimised markdown systems I've seen. I actually trialled this process a while ago in my store (I have no deep access to the computers and had to do it manually), and it was fascinating to see it converge on higher and lower markdowns based on the saleability of the product.