URL shortener, pastebin & file-sharing service, written from the ground up in C. https://r8r.be
 
 
 
 
 

# Lander

## The idea

A URL shortener has always been on my list of things I'd like to write myself. It's simple, yet useful.

For our Algorithms & Datastructures 3 class, we had to implement three different tries (a Patricia trie, a ternary trie, and a custom one). Since tries are efficient string-based search trees, this gave me the idea to use one as the backend for a URL shortener!

This implementation currently uses a ternary trie as its search tree. The persistence model is very simple: every time a URL is added, a line is appended to a text file, and on startup the lines of this file are replayed into the trie. The trie is kept entirely in memory, so no I/O operations are required when serving a redirect. This makes the server very fast.

## The name

I gave up giving my projects original names a long time ago, so now I just use the names of my friends ;p

## Benchmarking

I benchmark this tool using the wrk2 utility, and I've provided two Lua scripts to aid with this. To benchmark publishing redirects, use the following:

```sh
wrk2 -s bench/post.lua -t 10 -R 10k -d30s -c32 http://localhost:18080
```

And to bench GET requests:

```sh
wrk2 -s bench/get.lua -t 10 -R 25k -d30s -c32 http://localhost:18080
```

Of course, you're free to change the parameters; the provided Lua files take care of generating the URLs used in the benchmark.