# Lander

URL shortener, pastebin & file-sharing service, written from the ground up in C. https://r8r.be

## The idea

A URL shortener has always been on my list of things I'd like to write myself. It's simple, yet useful.

For our Algorithms & Datastructures 3 class, we had to implement three different tries (a Patricia trie, a ternary trie, and a custom one). Considering tries are efficient string-based search trees, this gave me the idea to use one as the backend for a URL shortener!
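
To make the idea concrete, here's a minimal ternary trie in C. This is an illustrative sketch only, not Lander's actual trie code (which lives in the repo's `trie/` static library); the `tst_node`, `tst_insert`, and `tst_get` names are made up for this example.

```c
#include <stdlib.h>

/* One node of a ternary trie: a single character `c`, three
 * children (`lo`/`hi` for characters sorting before/after `c`,
 * `eq` for the next character of keys matching `c`), and a value
 * that is non-NULL only on nodes ending a stored key. */
typedef struct tst_node {
    char c;
    struct tst_node *lo, *eq, *hi;
    void *value;
} tst_node;

/* Insert a non-empty `key` with `value`; allocation failures are
 * left unchecked for brevity. */
static void tst_insert(tst_node **n, const char *key, void *value) {
    if (!*n) {
        *n = calloc(1, sizeof **n);
        (*n)->c = *key;
    }
    if (*key < (*n)->c)
        tst_insert(&(*n)->lo, key, value);
    else if (*key > (*n)->c)
        tst_insert(&(*n)->hi, key, value);
    else if (key[1] == '\0')
        (*n)->value = value;
    else
        tst_insert(&(*n)->eq, key + 1, value);
}

/* Return the value stored under `key`, or NULL if absent. */
static void *tst_get(const tst_node *n, const char *key) {
    while (n) {
        if (*key < n->c) {
            n = n->lo;
        } else if (*key > n->c) {
            n = n->hi;
        } else if (key[1] == '\0') {
            return n->value;
        } else {
            key++;
            n = n->eq;
        }
    }
    return NULL;
}
```

A lookup does at most one character comparison per node visited, so exact-match lookups like short-URL keys stay cheap even as the number of stored entries grows.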

This implementation currently uses a ternary trie as its search tree. The persistence model is very simple: every time a URL is added, I append a line to a text file, and on startup I replay the lines of this file into the trie. The trie is stored completely in memory, and no I/O operations are required when serving a redirect. This makes the server very fast.
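
In other words, the text file is an append-only log that gets replayed on boot. Here's a rough sketch of that loop, reusing the `tst_node`/`tst_insert` sketch above; the `log_append`/`log_replay` names and the `<key> <url>` line format are assumptions for illustration, not Lander's actual on-disk format.

```c
#include <stdio.h>
#include <string.h>

/* Append one redirect to the log so it survives restarts. */
static int log_append(FILE *log, const char *key, const char *url) {
    if (fprintf(log, "%s %s\n", key, url) < 0)
        return -1;
    return fflush(log); /* push the new line out to the OS */
}

/* Rebuild the in-memory trie on startup by replaying every
 * "<key> <url>" line of the log into it. */
static void log_replay(FILE *log, tst_node **root) {
    char line[2048];
    while (fgets(line, sizeof line, log)) {
        char *sep = strchr(line, ' ');
        if (!sep)
            continue; /* skip malformed lines */
        *sep = '\0';
        char *url = sep + 1;
        url[strcspn(url, "\n")] = '\0'; /* strip trailing newline */
        tst_insert(root, line, strdup(url));
    }
}
```

An append-only log keeps writes constant-time and easy to reason about; the trade-off is that startup time grows with the total number of lines ever written.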

## The name

I gave up giving my projects original names a long time ago, so now I just use the names of my friends ;p

## Benchmarking

I benchmark this tool using the wrk2 utility; I've provided two Lua scripts to aid with this. To bench publishing redirects (10 threads, 32 connections, a constant rate of 10k requests/s for 30 s):

```sh
wrk2 -s bench/post.lua -t 10 -R 10k -d30s -c32 http://localhost:18080
```

And to bench GET requests at 25k requests/s:

```sh
wrk2 -s bench/get.lua -t 10 -R 25k -d30s -c32 http://localhost:18080
```

Of course, you're free to change the parameters; the provided Lua scripts take care of generating the URLs used in the benchmark.