
Heisse Preise

A terrible grocery price search "app". Fetches data from the big Austrian grocery chains daily and lets you search through it.

The project consists of a trivial Node.js Express server responsible for fetching the product data, massaging it, and serving it to the front end (see server.js). The front end is a least-effort vanilla HTML/JS app (see the sources in site/).
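
The "massaging" step can be sketched as follows — a minimal illustration only, assuming a merge-by-store-and-name strategy; the function name and merge key are hypothetical and not taken from server.js, but the item fields match those documented below.

```javascript
// Hypothetical sketch: merge a freshly fetched item list into the existing
// one, prepending to each item's priceHistory only when the price changed.
function mergePriceHistory(oldItems, newItems, date) {
  const byKey = new Map(oldItems.map((item) => [item.store + ":" + item.name, item]));
  for (const item of newItems) {
    const prev = byKey.get(item.store + ":" + item.name);
    item.priceHistory = prev ? [...prev.priceHistory] : [];
    // Record a new entry only when the price differs from the latest known one.
    if (item.priceHistory.length === 0 || item.priceHistory[0].price !== item.price) {
      item.priceHistory.unshift({ date, price: item.price });
    }
  }
  return newItems;
}

// Made-up sample data in the documented item shape.
const oldItems = [{ store: "billa", name: "Milch 1l", price: 1.39, priceHistory: [{ date: "2023-05-01", price: 1.39 }] }];
const newItems = [{ store: "billa", name: "Milch 1l", price: 1.19 }];
const merged = mergePriceHistory(oldItems, newItems, "2023-05-23");
// merged[0].priceHistory is now the new price followed by the old one
```

Keying on store plus name is just one plausible choice; the real server may well key on product IDs instead.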

Requirements

  • Node.js



Development

Install Node.js, then run the following in a shell of your choice.

git clone
cd heissepreise
mkdir -p data
npm install
npm run dev

The first time you run this, the data needs to be fetched from the stores. You should see log output like this.

Fetching data for date: 2023-05-23
Fetched LIDL data, took 0.77065160000324 seconds
Fetched MPREIS data, took 13.822936070203781 seconds
Fetched SPAR data, took 17.865891209602356 seconds
Fetched BILLA data, took 52.95784649944306 seconds
Fetched HOFER data, took 64.83968291568756 seconds
Fetched DM data, took 438.77065160000324 seconds
Merged price history
App listening on port 3000

The app listens on port 3000 by default. Open http://localhost:3000 in your browser.

Subsequent starts will fetch the data asynchronously, so you can start working immediately.


Production

Clone the repository, then install only the production dependencies and start the server:

git clone
cd heissepreise
node --dns-result-order=ipv4first /usr/bin/npm install --omit=dev
npm run start

The app listens on port 3000 by default. Open http://localhost:3000 in your browser.

Using data from

You can also get the raw data. The raw data is returned as a JSON array of items. An item has the following fields:

  • store: the store identifier (billa, spar, hofer, dm, lidl, mpreis, ...).
  • name: the product name.
  • price: the current price in €.
  • priceHistory: an array of { date: "yyyy-mm-dd", price: number } objects, sorted in descending order of date.
  • unit: the unit the product is sold at. May be undefined.
  • quantity: the quantity the product is sold at for the given price.
  • bio: whether the product is classified as organic/"Bio".
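
A query against this shape looks like the following sketch; the sample data is made up, not real store data.

```javascript
// Two illustrative items in the documented shape.
const items = [
  { store: "billa", name: "Milch 1l", price: 1.19, priceHistory: [{ date: "2023-05-23", price: 1.19 }, { date: "2023-05-01", price: 1.39 }], unit: "l", quantity: 1, bio: false },
  { store: "spar", name: "Butter 250g", price: 2.49, priceHistory: [{ date: "2023-05-23", price: 2.49 }], unit: "g", quantity: 250, bio: true },
];

// priceHistory is sorted newest first, so index 1 is the previous price.
const dropped = items.filter((item) => item.priceHistory.length > 1 && item.price < item.priceHistory[1].price);
console.log(dropped.map((item) => item.name)); // [ 'Milch 1l' ]
```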

If you run the project locally, you can use the data from the live site including the historical data as follows:

cd heissepreise
rm data/latest-canonical.*
curl -o data/latest-canonical.json

Restart the server with either npm run dev or npm run start.

Historical Data Credits

The live site features historical data from: