URL shortener, pastebin & file-sharing service, written from the ground up in C. https://r8r.be

Lander

The idea

A URL shortener has always been on my list of things I'd like to write myself. It's simple, yet useful.

For our Algorithms & Data Structures 3 class, we had to implement three different tries (a Patricia trie, a ternary trie, and a custom variant). Since these are efficient string-based search trees, this gave me the idea to use one as the backend for a URL shortener!

This implementation currently uses a ternary trie as its search tree. The persistence model is very simple: every time a URL is added, a line is appended to a text file, and on startup the lines of this file are read back into the trie. The trie is stored entirely in memory, so no I/O operations are required when serving a redirect. This makes the server very fast.

The name

I gave up on giving my projects original names a long time ago, so now I just use the names of my friends ;p

Benchmarking

I benchmark this tool using the wrk2 utility, and I've provided two Lua scripts to aid with this. To benchmark publishing redirects, use the following:

wrk2 -s bench/post.lua -t 10 -R 10k -d30s -c32 http://localhost:18080

And to bench GET requests:

wrk2 -s bench/get.lua -t 10 -R 25k -d30s -c32 http://localhost:18080

You're free to change the parameters, of course; the provided Lua scripts generate the URLs used in the benchmark.