Compare commits: weekly.202...master (760 commits)
Author | SHA1 | Date
--- | --- | ---
Jef Roosens | e9aca12560 | |
Jef Roosens | ad966c29c2 | |
Jef Roosens | 0e3efb655e | |
Jef Roosens | b817f8ebc9 | |
Jef Roosens | 3d0ecd0a2d | |
Jef Roosens | 77b5845c2c | |
CC | d336b7b877 | |
Delyan Angelov | ccc3271493 | |
Emirhan Yener | e5bbb23389 | |
Wertzui123 | c10ba6d81a | |
Delyan Angelov | b0fe21f018 | |
Alexander Medvednikov | c17200c33d | |
Larpon | 298dc77c38 | |
yuyi | e9a8f5fcc7 | |
Wertzui123 | 587101a1ea | |
Delyan Angelov | 2524207d1c | |
Alexander Medvednikov | 78c527b243 | |
Alexander Medvednikov | dbc51a4579 | |
l-m | ed8c63cc0b | |
Delyan Angelov | a7108ff05c | |
Delyan Angelov | b8d9bfec16 | |
Delyan Angelov | 436081a9f5 | |
Delyan Angelov | 1b87a4770c | |
Delyan Angelov | 856270cac2 | |
Alexander Medvednikov | 989c5e26f5 | |
Alexander Medvednikov | fe673e7963 | |
Alexander Medvednikov | ae2183043b | |
Dialga | 5cd5d551e3 | |
Alexander Medvednikov | c9ab086029 | |
yuyi | e6c3de2f46 | |
Larpon | 3fb88500a2 | |
yuyi | 585b5145fa | |
Delyan Angelov | 6a4ba22eae | |
pancake | 23d1c792c0 | |
Delyan Angelov | 74fb473301 | |
Delyan Angelov | 6c060b76fd | |
Delyan Angelov | 48b2ab157b | |
yuyi | c64c4907a2 | |
Delyan Angelov | e2e3992e0d | |
ghosttk | 8172fecb51 | |
Larpon | 9f5e442dab | |
yuyi | c160ba2a8d | |
Delyan Angelov | c6f94e9cab | |
yuyi | cab6355a38 | |
Delyan Angelov | f08c768c8e | |
Ulises Jeremias Cornejo Fandos | e505fcdac0 | |
yuyi | f6f77e5264 | |
David 'Epper' Marshall | 524df8da1b | |
Delyan Angelov | 473d26ce47 | |
pancake | 1caff5b379 | |
yuyi | 8703e336e0 | |
Delyan Angelov | 1fc9e1a716 | |
Delyan Angelov | fa2e8d8459 | |
Delyan Angelov | 0e4198f23b | |
yuyi | cf1fc6f090 | |
Alexander Medvednikov | 924239026c | |
Alexander Medvednikov | bc60b0d1a3 | |
Alexander Medvednikov | d215618f4c | |
yuyi | de136f6baf | |
Delyan Angelov | 37ef1ee453 | |
Delyan Angelov | 7b1ade237b | |
Delyan Angelov | b9cb56572f | |
Alexander Medvednikov | 6875a173ec | |
Delyan Angelov | 97be840a6d | |
lemon | e0310964d9 | |
Alexander Medvednikov | 10051e005a | |
wahur666 | 18dfaf6164 | |
yuyi | 01fdd5d07f | |
Alexander Medvednikov | b89617726c | |
Louis Schmieder | 5df3d8ac75 | |
Alexander Medvednikov | fb5a40d1c8 | |
yuyi | 26714fadc5 | |
Alexander Medvednikov | 9c72b85f72 | |
Danilo Lekovic | df239b9208 | |
yuyi | f2962c34dd | |
Delyan Angelov | 205221074c | |
yuyi | 0c1708db23 | |
yuyi | 5135952c9c | |
yuyi | 7f38b92ca8 | |
yuyi | 7c50d276c7 | |
lemon | e4e858b132 | |
yuyi | 6d8a0ad15d | |
Spydr | 2f1a896d18 | |
Leo Developer | 67716b5b59 | |
yuyi | 5efa67906c | |
yuyi | 3535927bcd | |
Delyan Angelov | 139c34c07d | |
Delyan Angelov | 4682e17ac1 | |
Delyan Angelov | 7e06203da8 | |
Delyan Angelov | ff8e286c88 | |
Spydr | 5c104cf981 | |
Spydr | 8fa1e30dd2 | |
Alexander Medvednikov | f08266ab66 | |
Joe Conigliaro | f3351b6a29 | |
Delyan Angelov | 5cea8d30fa | |
yuyi | a538ab7e8c | |
yuyi | cdf4ffc513 | |
lemon | c7a619d16e | |
Dialga | da7a166708 | |
spaceface | 26d051475a | |
Delyan Angelov | b27b6b2047 | |
yuyi | fcaf529228 | |
Larpon | 690a8422d1 | |
Delyan Angelov | f4869bcdc6 | |
Delyan Angelov | ea71ea3ec1 | |
yuyi | 922f003729 | |
Larpon | be23ddc253 | |
yuyi | 784361f153 | |
Delyan Angelov | e1360ccf8c | |
Ben | 39e54a508b | |
spaceface | 4ed9780b80 | |
yuyi | e6580fefaa | |
Spydr | 8563696476 | |
Larpon | f58e5a94c2 | |
Ben | c6b1c8d07a | |
Mikey | 5ac9b5c9f1 | |
Delyan Angelov | 4b3c3d9082 | |
Larpon | 96a9faf2fd | |
David Valdespino Pavon | 1d462136bc | |
Larpon | 8027919285 | |
Delyan Angelov | 82594c0156 | |
Delyan Angelov | a942ecf737 | |
Delyan Angelov | 82d23dedf1 | |
yuyi | 7780f56c31 | |
ChAoS_UnItY | 73b59c7b16 | |
yuyi | abf35270cf | |
ChAoS_UnItY | ce26d5bc5c | |
Delyan Angelov | 778fe2cde0 | |
Delyan Angelov | 6398043094 | |
Alexander Medvednikov | 7f67981637 | |
yuyi | 8a2236d3f8 | |
Alexander Medvednikov | e89a6269e4 | |
yuyi | ce771876a3 | |
ChAoS_UnItY | df80b33dc0 | |
Leo Developer | 3a90d8ef14 | |
Ikko Ashimine | 7b25957a26 | |
ChAoS_UnItY | b000728845 | |
yuyi | 4cf6abd99d | |
ChAoS_UnItY | f6ebbc99cd | |
yuyi | 5d429140a4 | |
Wertzui123 | d71fd04c81 | |
Delyan Angelov | 3c5ae41712 | |
Delyan Angelov | 3ac3375b43 | |
Delyan Angelov | 82eb495617 | |
Delyan Angelov | f2171b4148 | |
Delyan Angelov | 4cfff58fdf | |
Alexander Medvednikov | a8461a900d | |
yuyi | 66572d5ead | |
Delyan Angelov | c15d1c6e7e | |
Delyan Angelov | 6f9070e06d | |
Delyan Angelov | dbaecdc058 | |
yuyi | daa94de93f | |
yuyi | dcbd8d6405 | |
ChAoS_UnItY | 65066098d8 | |
yuyi | 251716fa0e | |
Delyan Angelov | 9f7656f328 | |
Delyan Angelov | c892b3203e | |
Hunam | 41414b5d5f | |
Delyan Angelov | aae5b9fb95 | |
Delyan Angelov | ed759b2ec9 | |
Delyan Angelov | 031629faa1 | |
Delyan Angelov | 9a0ec7f367 | |
yuyi | 545eaae77b | |
Alexander Medvednikov | 8b0e843cb8 | |
yuyi | 10fb16e00b | |
Claudio Cesar de Sá | 5bf246fce6 | |
Ben | e201665e92 | |
Wertzui123 | f971da9a93 | |
ChAoS_UnItY | a95cdac635 | |
yuyi | 55951e0943 | |
Hunam | d0a1608ede | |
yuyi | 33a2d00445 | |
Delyan Angelov | bf70f0b436 | |
Delyan Angelov | c91b646372 | |
yuyi | 786045c7da | |
Delyan Angelov | 5a2c271bd4 | |
Delyan Angelov | 2fa64f1471 | |
yuyi | fefb9643b2 | |
Delyan Angelov | 846ddfd728 | |
Alexander Medvednikov | f40c30c3dc | |
Alexander Medvednikov | c54c9b817c | |
Larpon | 84e375e38a | |
yuyi | 80cc88427b | |
playX | db34adaec8 | |
Delyan Angelov | dc30089c74 | |
Ben | 4ffdcf8058 | |
Delyan Angelov | 928dafeb6d | |
Delyan Angelov | fc64f09f0b | |
Delyan Angelov | 0f3b2c2ae7 | |
Delyan Angelov | 58ebc0680e | |
yuyi | 844ba2a177 | |
Hunam | 78d1b7f4ef | |
yuyi | 2c5febe25e | |
yuyi | 79d861ad4f | |
Delyan Angelov | 63d15086e7 | |
Delyan Angelov | c006d5c242 | |
yuyi | c0ef6dbde8 | |
yuyi | 7dcc19df55 | |
Delyan Angelov | c6a6eb9a3c | |
yuyi | b8e8768928 | |
yuyi | a46cf10e92 | |
Delyan Angelov | 4894f61998 | |
yuyi | a971b9a99a | |
Larpon | f3e7f24ee6 | |
spaceface | 52a3e5e780 | |
Delyan Angelov | f7995c8916 | |
Delyan Angelov | 36cb552918 | |
Delyan Angelov | 156aa661ee | |
Louis Schmieder | a83ac948a0 | |
yuyi | b97ef09b2d | |
Delyan Angelov | bb6ef8bba8 | |
Delyan Angelov | 8c969efe6b | |
Wertzui123 | 1017335365 | |
yuyi | 3849cdcecc | |
Delyan Angelov | 410b57b2fa | |
Larpon | 95cc535fc7 | |
Larpon | 9f5e999b4a | |
Larpon | 6c08af63ff | |
yuyi | 59e57f0c62 | |
kahsa | dd8c96f6bc | |
Delyan Angelov | 31c234485a | |
yuyi | e19ac0c4a7 | |
yuyi | 79a75c5ac0 | |
yuyi | 0eb3f8854d | |
Ben | f431020764 | |
yuyi | f35f7fe997 | |
yuyi | a5b98cb267 | |
yuyi | 5ade39f8db | |
Delyan Angelov | 953ef1f8c9 | |
Delyan Angelov | dda49fe735 | |
yuyi | a3c0a9b791 | |
yuyi | 4ef9e2c05a | |
Alexander Medvednikov | 863eeca2e0 | |
Daniel Däschle | 5e95bdc451 | |
yuyi | 7f03b89611 | |
spaceface | ba859c584b | |
Delyan Angelov | 5328dabad1 | |
spaceface | e5ff2ab455 | |
yuyi | 1f3336c9d3 | |
Alexander Medvednikov | 245d28d57a | |
Daniel Däschle | d3ffd983c8 | |
Subhomoy Haldar | 3647fb4def | |
Delyan Angelov | 64a686f41f | |
yuyi | 50ab2cfd1a | |
Delyan Angelov | 0ceb16f285 | |
Delyan Angelov | c0dcc80e18 | |
Delyan Angelov | a7afb2d1eb | |
Ben | 971c55cf30 | |
Daniel Däschle | efc5cab8c3 | |
Alexander Medvednikov | 53c217fe5e | |
Vincenzo Palazzo | 17bba712bd | |
Daniel Däschle | d81fbb1ccd | |
Delyan Angelov | dd1049f21d | |
yuyi | 28b0cbddad | |
yuyi | 913164bc73 | |
yuyi | bf44572f30 | |
StunxFS | 11bdb04d0c | |
Delyan Angelov | ca00b59b3f | |
David 'Epper' Marshall | 120f31b4d9 | |
David 'Epper' Marshall | 23568f19da | |
crthpl | 95d24e543d | |
yuyi | 55e7daa2f9 | |
crthpl | 46f94e8d68 | |
Daniel Däschle | a52fbc5e51 | |
yuyi | 3291c59ebf | |
Delyan Angelov | 634e8c3624 | |
yuyi | 15c62bc8e8 | |
Delyan Angelov | 25812e52f0 | |
Delyan Angelov | a52590572f | |
Delyan Angelov | 3d5617c4fa | |
Delyan Angelov | 809b1ca3b4 | |
yuyi | b482c0512b | |
Delyan Angelov | 805a7d9713 | |
yuyi | 5b96f7e8fd | |
yuyi | 4cbfa884c5 | |
Delyan Angelov | f2447a4bd8 | |
Delyan Angelov | 2cc3b74e19 | |
Larpon | 9de0c725f6 | |
Adam Oates | a786c58d0a | |
yuyi | 417a6dc506 | |
Larpon | 8eea861c93 | |
Delyan Angelov | ed17779434 | |
Delyan Angelov | ebac3bebb1 | |
playX | a608516b82 | |
spaceface | b5fb848508 | |
Delyan Angelov | 65d9c8fa6f | |
Delyan Angelov | dfa2d63616 | |
Delyan Angelov | 4e56147223 | |
Alexander Medvednikov | 2a06290ac7 | |
Ned | db4b49a5ca | |
Delyan Angelov | da42f0d42b | |
Delyan Angelov | 3fc4459485 | |
yuyi | 020845f6c3 | |
yuyi | d7b1e57186 | |
yuyi | 60e817ff32 | |
yuyi | d6aa85d059 | |
playX | 7c6eaa8204 | |
Delyan Angelov | 78ab3296c9 | |
WoodyAtHome | 02c8a6057c | |
Alexander Medvednikov | d10f83ce15 | |
playX | bc397bb0e1 | |
Delyan Angelov | 32dd801201 | |
Delyan Angelov | 9cb8bb2968 | |
yuyi | c624de8523 | |
Delyan Angelov | e5c7fe3006 | |
Delyan Angelov | 39874ae168 | |
yuyi | d59f4e9479 | |
Larpon | ef6225c542 | |
yuyi | 0ab4133128 | |
spaceface | 36bec823c2 | |
WoodyAtHome | c2b763655d | |
Delyan Angelov | 1cf683d482 | |
Delyan Angelov | b4c529066a | |
Ben | cbb24d34c9 | |
yuyi | 7fe3ef9a6e | |
Larpon | 5068b8b293 | |
Jah-On | 02e026e298 | |
WoodyAtHome | eeff02a8ee | |
spaceface | c01a8a1737 | |
Delyan Angelov | c2bc9f4960 | |
crthpl | e4065bd57b | |
David 'Epper' Marshall | c28051020a | |
Delyan Angelov | b50f7fdc71 | |
Delyan Angelov | c70e18ea8f | |
Delyan Angelov | 3a09ccc80a | |
Larpon | dd6629e932 | |
David 'Epper' Marshall | 8d141878ce | |
Delyan Angelov | 67963e0ff2 | |
Delyan Angelov | 1225a865a3 | |
yuyi | fe9f97074b | |
Alexander Medvednikov | 3adad32355 | |
Alexander Medvednikov | b42c824cdb | |
Daniel Däschle | f0d46413d9 | |
JalonSolov | b3e80a3100 | |
j. redhead | 441637eeb4 | |
Delyan Angelov | cee7856c0f | |
Delyan Angelov | 714ce4e7fc | |
Delyan Angelov | c1bafe7a5a | |
playX | 6ec4185017 | |
Daniel Däschle | d407a6449d | |
Delyan Angelov | ed12a5c84c | |
yuyi | 3c95504a35 | |
Daniel Däschle | d679146a80 | |
Delyan Angelov | df029da942 | |
Daniel Däschle | 0972e67f72 | |
Delyan Angelov | 8ef9dc6247 | |
Delyan Angelov | 668d1b04d2 | |
Hunam | 20139ad756 | |
playX | 4952967366 | |
yuyi | f48f7014f0 | |
penguindark | e93a8766e5 | |
Delyan Angelov | b7ca4c1668 | |
yuyi | 8830af5c89 | |
yuyi | 5bc4fea9e0 | |
CC | 901b8f0c24 | |
spaceface | 49382f1f43 | |
Emily Hudson | c19b037880 | |
yuyi | cd4fa041ff | |
Larpon | 34a252ef84 | |
David 'Epper' Marshall | 26b81d68b5 | |
Larpon | 0ec1c8d9f0 | |
yuyi | 3afc7c4c6d | |
Delyan Angelov | cf536b848b | |
yuyi | 8f765ed5f1 | |
yuyi | 5697d4375b | |
yuyi | 606d8cfaca | |
Isaiah | 9e09b709e3 | |
yuyi | 940c78bdfd | |
Subhomoy Haldar | 79f8a3c796 | |
StunxFS | d24dce8eb3 | |
Dialga | 4400f9891e | |
yuyi | 8519996201 | |
Merlin Diavova | 106487d62f | |
David 'Epper' Marshall | 650fb493bd | |
Delyan Angelov | 084f2867b6 | |
WoodyAtHome | a0a3499bdc | |
Alexander Medvednikov | 0526499d5f | |
Merlin Diavova | f8747d05dc | |
StunxFS | d5e70552eb | |
David 'Epper' Marshall | aef95721a4 | |
yuyi | b04d46770b | |
yuyi | 724e7f037a | |
David 'Epper' Marshall | a91226c376 | |
Delyan Angelov | b53b1cc7cb | |
Delyan Angelov | 7ecd65221e | |
Delyan Angelov | 56cf0b0a2e | |
Delyan Angelov | 7f974a275a | |
Delyan Angelov | f956acd2f6 | |
Delyan Angelov | 9e8e364493 | |
Alexander Medvednikov | ca42ace367 | |
Alexander Medvednikov | 35cfa0da7c | |
playX | 6a6c005dc0 | |
Alexander Medvednikov | 9fb8de14dd | |
Alexander Medvednikov | 89c1e7f980 | |
yuyi | 70184ad1f8 | |
Alexander Medvednikov | 14f06ead1b | |
yuyi | 621574c12a | |
Hunam | 0699f324b5 | |
Lathanao | ce99a306c0 | |
StunxFS | 87de6df0e6 | |
Ekopalypse | 2027a1969b | |
Daniel Däschle | 76cdf75299 | |
yuyi | 45fe87c9e3 | |
yuyi | 3091f31019 | |
Claudio Cesar de Sá | 634796ae42 | |
Andréas Livet | 9fde5b067b | |
Daniel Däschle | 89fe82b732 | |
yuyi | b6058bfd6e | |
playX | 8afdb1c3ef | |
StunxFS | 7499506cf8 | |
Delyan Angelov | 785e9af8f1 | |
Delyan Angelov | 7170a09382 | |
Delyan Angelov | 01c1892995 | |
Delyan Angelov | a6b3e5d6a5 | |
Ned | 76a7354506 | |
yuyi | 4242e7610f | |
yuyi | e2aa5c9b3f | |
Ikko Ashimine | 223b96a59a | |
Delyan Angelov | 1a4d9017e2 | |
Alexander Medvednikov | af8be14639 | |
yuyi | ac90a2b53d | |
playX | 3bd6455178 | |
Alexander Medvednikov | 3d4b8dffdf | |
yuyi | f321422964 | |
Alexander Medvednikov | 1e9156fd71 | |
yuyi | 3732db2bcc | |
playX | 146051b231 | |
Delyan Angelov | 04a77c731e | |
yuyi | 63eacede95 | |
Hunam | 6da300428e | |
yuyi | 276bd8060c | |
Delyan Angelov | 0e5c1cee48 | |
Delyan Angelov | 4da2908d63 | |
StunxFS | cf92224248 | |
Delyan Angelov | ab1c265679 | |
spaceface | db185598d2 | |
yuyi | 990afe37e1 | |
yuyi | d72a25098a | |
yuyi | 25c1b174ca | |
yuyi | b9cf2db6a8 | |
playX | afbe6bf3a2 | |
Isaiah | a4fd349cf1 | |
yuyi | 968d2b4654 | |
spaceface | 332e821518 | |
Delyan Angelov | aed2d0caf2 | |
David 'Epper' Marshall | 91c1157810 | |
Daniel Däschle | ec92d467d1 | |
playX | 0b54196962 | |
Daniel Däschle | 9f8a34a528 | |
Delyan Angelov | 63d413f93c | |
Delyan Angelov | c0b37409d2 | |
Delyan Angelov | 8da42bfc85 | |
Delyan Angelov | 5277ce7dce | |
David 'Epper' Marshall | a2338dbb7c | |
Delyan Angelov | dcdfdf4dd8 | |
spaceface | dab649ec8a | |
Alexander Medvednikov | ce31a01a70 | |
yuyi | cd30b6ea82 | |
Daniel Däschle | 08fd0ce0de | |
StunxFS | db185e6580 | |
Benjamin Thomas | 48eb40cd2c | |
David 'Epper' Marshall | 881d0c04f1 | |
yuyi | ec865cfb37 | |
yuyi | 317acfda97 | |
Wertzui123 | 872f739396 | |
StunxFS | 995485c649 | |
StunxFS | 8b798acadd | |
Alexander Medvednikov | 77645fcf35 | |
spaceface | 14309594fe | |
David 'Epper' Marshall | 5a42350a78 | |
StunxFS | e24482a143 | |
playX | e56385d57d | |
yuyi | 7aca67fb60 | |
yuyi | dd94ab890a | |
yuyi | c802688690 | |
Atom | a225b25117 | |
Delyan Angelov | 4538efd8f4 | |
Delyan Angelov | e0ed8f8278 | |
Delyan Angelov | f72297c331 | |
Delyan Angelov | be04ec0620 | |
yuyi | 7dd5d9ee61 | |
yuyi | 09f8b6a380 | |
tzSharing | eb03fad934 | |
Delyan Angelov | f53b9b4f12 | |
Alexander Medvednikov | 7dbfa86f25 | |
yuyi | 82ac39eca6 | |
yuyi | 752e105f25 | |
tzSharing | 85f616877f | |
playX | b76095f28a | |
R cqls | c26b7666c7 | |
yuyi | be513b4c27 | |
Larpon | 1c48a8d760 | |
yuyi | 660201c188 | |
Brian Callahan | b9a0e2d285 | |
yuyi | 38afd74d26 | |
playX | 95880dfe5c | |
playX | dce2173ac9 | |
Alexander Medvednikov | 501b293e84 | |
Larpon | d799abd139 | |
yuyi | e42dc8e228 | |
yuyi | f89c81087b | |
yuyi | aeba110d01 | |
Larpon | 283d181047 | |
yuyi | 88f22b4367 | |
Delyan Angelov | 60e205a193 | |
Delyan Angelov | d35d67c2bd | |
Nick Treleaven | d8a5df9044 | |
Delyan Angelov | 147e6e669f | |
yuyi | 922cee9162 | |
Delyan Angelov | 1291b621f6 | |
fleur | ddbe812f1b | |
Haren S | 11ee2b6409 | |
stackotter | 563469ed9f | |
yuyi | c819f0f86f | |
Alexander Medvednikov | 9355048b6c | |
Delyan Angelov | 3388caa6c5 | |
Delyan Angelov | 365e7d6b34 | |
yuyi | d934472b17 | |
yuyi | b86320a669 | |
Claudio Cesar de Sá | a2db44bc38 | |
yuyi | 5dce091379 | |
Alexander Medvednikov | 9b565bf765 | |
tzSharing | 03d21a727e | |
Larpon | 506259adb6 | |
yuyi | 26b0e7fd34 | |
Delyan Angelov | 2080557f50 | |
yuyi | 8a18f9175a | |
yuyi | 448938be0d | |
yuyi | 99eb9fdaab | |
yuyi | f13583b04a | |
yuyi | 8013bd43b0 | |
Delyan Angelov | 5e8c4a3aff | |
StunxFS | 2a0b372d0d | |
ChAoS_UnItY | c5824c36f2 | |
Delyan Angelov | c789ea5a15 | |
sunnylcw | 4491b535ec | |
Delyan Angelov | 31b28af179 | |
lemon | 960225f7a7 | |
Nick Treleaven | 1533b77404 | |
yuyi | 0260c2a552 | |
Hunam | 0374f021c5 | |
mjh | 1546645f63 | |
Larpon | a1342e85c3 | |
yuyi | 52ea0b8cc3 | |
yuyi | ce4c2afc9c | |
yuyi | 44ba19716b | |
yuyi | 0c3b69eaef | |
Alexander Medvednikov | 364656b312 | |
Isaiah | 0887b59254 | |
yuyi | 8cc79e4299 | |
playX | 711e90cf99 | |
playX | f6a0c26a85 | |
Delyan Angelov | 9646e4b9d8 | |
Delyan Angelov | 006df58451 | |
JalonSolov | daf5d32327 | |
yuyi | a318a2e09e | |
yuyi | e16ce3af88 | |
yuyi | 6164654d11 | |
Nick Treleaven | 4400efeb9f | |
yuyi | 7ef64bde50 | |
yuyi | d0a11f50ca | |
Delyan Angelov | 379b638b57 | |
yuyi | fe371845da | |
spaceface | 775c4c34b5 | |
Ikko Ashimine | 56a3539ea9 | |
yuyi | 17c34b09a6 | |
yuyi | cb44f5981e | |
Vincenzo Palazzo | 4f14f7714f | |
Vincenzo Palazzo | 48486e1afb | |
Cameron Katri | 1fc54a1e5b | |
Delyan Angelov | 8a57f7ed2d | |
Julien de Carufel | 16ead4e63c | |
Alexander Medvednikov | 43931a8e77 | |
Delyan Angelov | 5b7e538119 | |
Delyan Angelov | 4a71b27c52 | |
Delyan Angelov | d75c408868 | |
Delyan Angelov | 82c5621621 | |
Delyan Angelov | 675f8b6300 | |
Delyan Angelov | ad231cec2f | |
Delyan Angelov | 87a373d82c | |
Delyan Angelov | c7aedb8e8d | |
Delyan Angelov | 375361b787 | |
Delyan Angelov | 840f474fb5 | |
Delyan Angelov | e802e0b9cb | |
Delyan Angelov | bb2a324d61 | |
Delyan Angelov | 868d3e1008 | |
Alexander Medvednikov | c03fe020bf | |
Delyan Angelov | 6f5a513d8b | |
Delyan Angelov | e18cb9748f | |
Delyan Angelov | 173e6a943b | |
Delyan Angelov | cc8803c602 | |
Alexander Medvednikov | 78cb6e2b41 | |
Alexander Medvednikov | 1c6f63ac0a | |
Alexander Medvednikov | fbb9e65c0f | |
Alexander Medvednikov | ae6a25f44e | |
Alexander Medvednikov | e97ebf8cfc | |
Alexander Medvednikov | 258d1f77dc | |
Alexander Medvednikov | af73e195da | |
Alexander Medvednikov | fb192d949b | |
Alexander Medvednikov | 0527ac633e | |
Alexander Medvednikov | dbcf6e9c33 | |
Alexander Medvednikov | c14984899b | |
Alexander Medvednikov | a1372e284c | |
Alexander Medvednikov | c3ad4e2069 | |
Alexander Medvednikov | d4a0d6f73c | |
Alexander Medvednikov | b49d873217 | |
Alexander Medvednikov | 014c3c97f0 | |
Alexander Medvednikov | 7f3b91e688 | |
Alexander Medvednikov | 1e7eb713fb | |
Alexander Medvednikov | ba7b329c73 | |
Alexander Medvednikov | e6ff1508d2 | |
Delyan Angelov | 566f150b24 | |
Delyan Angelov | c3ee4fb2a2 | |
Alexander Medvednikov | 3b36f16365 | |
Alexander Medvednikov | 0dff050735 | |
yuyi | 2d6d6c9ac9 | |
yuyi | c4dff0d797 | |
yuyi | 72c2dc805d | |
Delyan Angelov | f6c9a60f99 | |
Delyan Angelov | bf62b2e33e | |
Delyan Angelov | f1f75897b3 | |
Delyan Angelov | a62560d2c1 | |
Delyan Angelov | e555335bf0 | |
fleur | 68401d9dc8 | |
Delyan Angelov | 5905590e78 | |
Vincenzo Palazzo | 48c295150f | |
yuyi | dc08105022 | |
Delyan Angelov | e5809363de | |
Alexander Medvednikov | d7adb67d52 | |
Alexander Medvednikov | 2525a30b5f | |
yuyi | e3e5bef139 | |
Nick Treleaven | c780de6282 | |
fleur | 6718958058 | |
牧心 | a810fbb80e | |
Delyan Angelov | 8788512c4d | |
Delyan Angelov | 4c7cdd2a2d | |
Delyan Angelov | 62032c43db | |
Delyan Angelov | 9b43713ec5 | |
Delyan Angelov | 716cb17aea | |
crthpl | afb07e0e16 | |
yuyi | 3e3b2e25db | |
Larpon | 07207db998 | |
Delyan Angelov | cb969e0934 | |
Delyan Angelov | 382586da6d | |
Delyan Angelov | e64c8cce62 | |
yuyi | 5551cb248c | |
牧心 | 25d8faabf6 | |
Delyan Angelov | 843ce43077 | |
Delyan Angelov | e4dfffd70b | |
Nick Treleaven | 1938bc48e7 | |
yuyi | fa66183f43 | |
Delyan Angelov | a0e7a46be4 | |
Delyan Angelov | 6c25f5b291 | |
Subhomoy Haldar | 3f90809035 | |
牧心 | 11d9a67e3b | |
Delyan Angelov | 8517b8f8b0 | |
Delyan Angelov | 88c4a64a15 | |
yuyi | 93a5d03182 | |
Vincenzo Palazzo | 3571f66a82 | |
yuyi | 473bc0254d | |
Delyan Angelov | 89d64b21ea | |
Delyan Angelov | df30b79971 | |
Daniel Oberhoff | 58febe4607 | |
yuyi | 704e3c6e72 | |
Nick Treleaven | bf385d2ac9 | |
pancake | 804f2f56d4 | |
pancake | e3da3101f6 | |
Delyan Angelov | 60e718e7c6 | |
Vincenzo Palazzo | 2d867a2766 | |
yuyi | 2a88b313d4 | |
yuyi | 617608b23d | |
Larpon | 45a427e68b | |
Larpon | 52f1c615a6 | |
Joe Conigliaro | a0c07454b1 | |
Joe Conigliaro | 426e9d1734 | |
Joe Conigliaro | e1c8b07fa5 | |
Joe Conigliaro | 8dc2601080 | |
mir.zhou | 6425000ce4 | |
Delyan Angelov | 5a695c81dc | |
crthpl | b232a3b0d1 | |
Delyan Angelov | eea46c4e1a | |
Subhomoy Haldar | 022fae1e7f | |
Pascal Masschelier | 95753ffb30 | |
playX | 6a820c2845 | |
Vincenzo Palazzo | 4666a27e5f | |
Larpon | d34ef69229 | |
yuyi | e7fd8c4e7c | |
yuyi | a58dde48f8 | |
yuyi | c9dcdf6744 | |
Larpon | 56e6fd01c5 | |
Delyan Angelov | c5d8d27b90 | |
Vincenzo Palazzo | 6412f8ba0b | |
Larpon | 1482db6d1a | |
Vincenzo Palazzo | 359f16fdfd | |
yuyi | 0cba579a7b | |
Delyan Angelov | 2ecfd1b351 | |
Larpon | 11ccf06441 | |
Larpon | 2350dbbd57 | |
Larpon | 0b046c14a8 | |
Delyan Angelov | f5e4d17cf3 | |
Alexander Medvednikov | 829fed4af0 | |
Nick Treleaven | 7d8db1042d | |
yuyi | f6b8e1e13f | |
Delyan Angelov | 0bd8fbc9a8 | |
Nick Treleaven | 2cd9c91e98 | |
Larpon | aa9e2ebb25 | |
pancake | 5369379738 | |
Alexander Medvednikov | 92bfd9b353 | |
StunxFS | 725b472d37 | |
Nick Treleaven | 91b40304b7 | |
StunxFS | 1211b2e941 | |
Alexander Medvednikov | 719a3b5de3 | |
Alexander Medvednikov | a55e930c00 | |
Alexander Medvednikov | cc227d8520 | |
Nick Treleaven | d10135e2c4 | |
Delyan Angelov | 44603f8e59 | |
Vincenzo Palazzo | 51c1d666c2 | |
StunxFS | 38853568b4 | |
Nick Treleaven | 782d5374c9 | |
Delyan Angelov | a1e9cae5d2 | |
yuyi | 0497b885dc | |
yuyi | 8c55a9ecd3 | |
Cameron Katri | 340543dfc0 | |
Delyan Angelov | 71dc6c224a | |
Cameron Katri | d585fbea8a | |
Nick Treleaven | 42f92db0ab | |
yuyi | faa55b46de | |
yuyi | 0bf0c73a49 | |
pancake | af79c1e6ef | |
Vincenzo Palazzo | d7817863c6 | |
Delyan Angelov | 9d2529b611 | |
yuyi | 6987f2c087 | |
Vincenzo Palazzo | 02c80bd445 | |
Nick Treleaven | 9c1981a309 | |
yuyi | bc98c11d9d | |
yuyi | db3bbb58cf | |
Nick Treleaven | a87cd9663e | |
R cqls | 5c43493183 | |
Delyan Angelov | fb5df9665e | |
Delyan Angelov | fa3fa2e74f | |
Cameron Katri | b15240185e | |
Delyan Angelov | 4222fd0862 | |
Delyan Angelov | 74eabba52e | |
kylepritchard | 6137ce23c0 | |
yuyi | ae1cb5697e | |
yuyi | d40a502981 | |
Delyan Angelov | 61f078664c | |
Subhomoy Haldar | 7ef7188f4b | |
Delyan Angelov | 8121a8ada0 | |
Delyan Angelov | 566735b298 | |
Cameron Katri | 093994655c | |
Cameron Katri | 3e69d3813b | |
Delyan Angelov | cc637e5ee8 | |
yuyi | 55d9464890 | |
yuyi | 42a67831bf | |
yuyi | c71770d9c5 | |
Delyan Angelov | 04cc037955 | |
Delyan Angelov | 7ee93c8a20 | |
yuyi | fd34ebd84e | |
Delyan Angelov | 4f551d76c0 |
.cirrus.yml (74 changed lines)
```diff
@@ -13,77 +13,3 @@ freebsd_task:
     ##tcc -v -v
     echo 'Build cmd/tools/fast'
     cd cmd/tools/fast && ../../../v fast.v && ./fast -clang
-
-
-arm64_task:
-  name: Code CI / arm64-ubuntu-tcc
-  trigger_type: manual
-  arm_container:
-    image: ubuntu:latest
-  install_script: apt-get update -y && apt-get install --quiet -y build-essential pkg-config wget git valgrind libsqlite3-dev libssl-dev libxi-dev libxcursor-dev libfreetype6-dev libxi-dev libxcursor-dev libgl-dev xfonts-75dpi xfonts-base libmysqlclient-dev libpq-dev gcc-10-arm-linux-gnueabihf libc6-dev-armhf-cross qemu-user
-  env:
-    DEBIAN_FRONTEND: noninteractive
-    VFLAGS: -cc tcc -no-retry-compilation
-    VJOBS: 2
-  script: |
-    set -e
-
-    wget https://github.com/wkhtmltopdf/packaging/releases/download/0.12.6-1/wkhtmltox_0.12.6-1.focal_arm64.deb
-    apt install --fix-missing -y ./wkhtmltox_0.12.6-1.focal_arm64.deb
-
-    # ensure that a V binary can be built, even if tcc has broken for some reason
-    VFLAGS='-cc gcc' make
-
-    ./v -g self
-    ./v -g self
-
-    ./v -d debug_malloc -d debug_realloc -o v cmd/v
-    ./v -cg -cstrict -o v cmd/v
-    # Test v->c
-    thirdparty/tcc/tcc.exe -version
-    ./v -cg -o v cmd/v # Make sure vtcc can build itself twice
-
-    # - name: v self compilation
-    ./v -o v2 cmd/v && ./v2 -o v3 cmd/v && ./v3 -o v4 cmd/v
-
-    # - name: v self compilation with -skip-unused
-    ./v -skip-unused -o v2 cmd/v && ./v2 -skip-unused -o v3 cmd/v && ./v3 -skip-unused -o v4 cmd/v
-
-    # - name: v doctor
-    ./v doctor
-
-    # - name: Verify `v test` works
-    ./v cmd/tools/test_if_v_test_system_works.v
-    ./cmd/tools/test_if_v_test_system_works
-
-    # - name: Self tests
-    ./v test-self
-
-    ## - name: Self tests (-cstrict)
-    ## ./v -cstrict test-self
-
-    # - name: Test time functions in a timezone UTC-12
-    TZ=Etc/GMT+12 ./v test vlib/time/
-    # - name: Test time functions in a timezone UTC-3
-    TZ=Etc/GMT+3 ./v test vlib/time/
-    # - name: Test time functions in a timezone UTC+3
-    TZ=Etc/GMT-3 ./v test vlib/time/
-    # - name: Test time functions in a timezone UTC+12
-    TZ=Etc/GMT-12 ./v test vlib/time/
-    # - name: Test time functions in a timezone using daylight saving (Europe/Paris)
-    TZ=Europe/Paris ./v test vlib/time/
-    # - name: Build examples
-    ./v -W build-examples
-    # - name: Test building v tools
-    ./v -W build-tools
-    # - name: Test v binaries
-    ./v build-vbinaries
-    # - name: Run a VSH script
-    ./v run examples/v_script.vsh
-    # - name: Test v tutorials
-    ./v tutorials/building_a_simple_web_blog_with_vweb/code/blog
-
-    # test the arm32 version of tcc
-    # TODO: support something like `V_EMULATOR=qemu-arm v run file.v` so that V automatically runs all binaries under qemu
-    ./v -arch arm32 -cc arm-linux-gnueabihf-gcc-10 -o av cmd/v && qemu-arm -L /usr/arm-linux-gnueabihf ./av -arch arm32 -cc arm-linux-gnueabihf-gcc-10 -o av2 cmd/v && qemu-arm -L /usr/arm-linux-gnueabihf ./av2 -arch arm32 -cc arm-linux-gnueabihf-gcc-10 -o av3 cmd/v && qemu-arm -L /usr/arm-linux-gnueabihf ./av3 -arch arm32 -cc arm-linux-gnueabihf-gcc-10 -o av4 cmd/v
-    ./v -arch arm32 -o closure_test.c vlib/v/tests/closure_test.v && arm-linux-gnueabihf-gcc-10 -o closure_test closure_test.c && qemu-arm -L /usr/arm-linux-gnueabihf ./closure_test
```
```diff
@@ -15,6 +15,8 @@ ls -lat
+    ## try running the known failing tests first to get faster feedback
+    ./v test vlib/builtin/string_test.v vlib/strings/builder_test.v
 
     ./v test-cleancode
 
     ./v test-self
 
     ./v build-vbinaries
```
@ -0,0 +1,67 @@
|
|||
#!/usr/bin/env -S v
|
||||
|
||||
module main
|
||||
|
||||
import os
|
||||
import vab.vxt
|
||||
import vab.android.ndk
|
||||
|
||||
fn main() {
|
||||
assert ndk.found()
|
||||
assert vxt.found()
|
||||
|
||||
work_dir := os.join_path(os.temp_dir(), 'android_cross_compile_test')
|
||||
os.rm(work_dir) or {}
|
||||
os.mkdir_all(work_dir) or { panic(err) }
|
||||
vexe := vxt.vexe()
|
||||
|
||||
examples_dir := os.join_path(vxt.home(), 'examples')
|
||||
v_example := os.join_path(examples_dir, 'toml.v')
|
||||
|
||||
ndk_version := ndk.default_version()
|
||||
|
||||
sysroot_path := ndk.sysroot_path(ndk_version) or { panic(err) }
|
||||
include_path := os.join_path(sysroot_path, 'usr', 'include')
|
||||
android_include_path := os.join_path(include_path, 'android')
|
||||
|
||||
//'-I"$include_path"'
|
||||
cflags := ['-I"$android_include_path"', '-Wno-unused-value', '-Wno-implicit-function-declaration',
|
||||
'-Wno-int-conversion']
|
||||
for arch in ndk.supported_archs {
|
||||
for level in ['min', 'max'] {
|
||||
compiler_api := match level {
|
||||
'min' {
|
||||
ndk.compiler_min_api(.c, ndk_version, arch) or { panic(err) }
|
||||
}
|
||||
'max' {
|
||||
ndk.compiler_max_api(.c, ndk_version, arch) or { panic(err) }
|
||||
}
|
||||
else {
|
||||
panic('invalid min/max level')
|
||||
}
|
||||
}
|
||||
|
||||
os.setenv('VCROSS_COMPILER_NAME', compiler_api, true)
|
||||
c_file := os.join_path(work_dir, arch + '-' + level + '.c')
|
||||
o_file := os.join_path(work_dir, arch + '-' + level + '.o')
|
||||
|
||||
// x.v -> x.c
|
||||
v_compile_cmd := '$vexe -o $c_file -os android -gc none $v_example'
|
||||
vres := os.execute(v_compile_cmd)
|
||||
if vres.exit_code != 0 {
|
||||
panic('"$v_compile_cmd" failed: $vres.output')
|
||||
}
|
||||
assert os.exists(c_file)
|
||||
|
||||
// x.c -> x.o
|
||||
compile_cmd := '$compiler_api ${cflags.join(' ')} -c $c_file -o $o_file'
|
||||
cres := os.execute(compile_cmd)
|
||||
if cres.exit_code != 0 {
|
||||
panic('"$compile_cmd" failed: $cres.output')
|
||||
}
|
||||
assert os.exists(o_file)
|
||||
compiler_exe_name := os.file_name(compiler_api)
|
||||
println('Compiled examples/toml.v successfully for ($level) $arch $compiler_exe_name')
|
||||
}
|
||||
}
|
||||
}
|
|
@ -15,6 +15,7 @@ concurrency:
|
|||
jobs:
|
||||
ubuntu-tcc:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
env:
|
||||
VFLAGS: -cc tcc -no-retry-compilation
|
||||
|
@ -33,6 +34,7 @@ jobs:
|
|||
run: |
|
||||
echo $VFLAGS
|
||||
make
|
||||
./v test-cleancode
|
||||
./v -d debug_malloc -d debug_realloc -o v cmd/v
|
||||
./v -cg -cstrict -o v cmd/v
|
||||
# Test v -realloc arena allocation
|
||||
|
@ -54,10 +56,12 @@ jobs:
|
|||
echo $VFLAGS
|
||||
./v cmd/tools/test_if_v_test_system_works.v
|
||||
./cmd/tools/test_if_v_test_system_works
|
||||
- name: All code is formatted
|
||||
run: ./v test-cleancode
|
||||
- name: Self tests
|
||||
run: ./v test-self
|
||||
# - name: Self tests (-cstrict)
|
||||
# run: ./v -cstrict test-self
|
||||
# run: V_CI_CSTRICT=1 ./v -cstrict test-self
|
||||
- name: Test time functions in a timezone UTC-12
|
||||
run: TZ=Etc/GMT+12 ./v test vlib/time/
|
||||
- name: Test time functions in a timezone UTC-3
|
||||
|
@ -91,6 +95,7 @@ jobs:
|
|||
|
||||
ubuntu-tcc-boehm-gc:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
env:
|
||||
VFLAGS: -cc tcc -no-retry-compilation
|
||||
|
@ -125,13 +130,15 @@ jobs:
|
|||
run: |
|
||||
./v -gc boehm cmd/tools/test_if_v_test_system_works.v
|
||||
./cmd/tools/test_if_v_test_system_works
|
||||
- name: All code is formatted
|
||||
run: ./v test-cleancode
|
||||
- name: Self tests with `-gc boehm` with V compiler using Boehm-GC itself
|
||||
run: ./v -gc boehm test-self
|
||||
- name: Test leak detector
|
||||
run: |
|
||||
./v -gc boehm_leak -o testcase_leak vlib/v/tests/testcase_leak.vv
|
||||
./testcase_leak 2>leaks.txt
|
||||
grep "Found 1 leaked object" leaks.txt && grep ", sz=1000," leaks.txt
|
||||
grep "Found 1 leaked object" leaks.txt && grep -P ", sz=\s?1000," leaks.txt
|
||||
- name: Test leak detector not being active for `-gc boehm`
|
||||
run: |
|
||||
./v -gc boehm -o testcase_leak vlib/v/tests/testcase_leak.vv
|
||||
|
@ -144,7 +151,8 @@ jobs:
|
|||
[ "$(stat -c %s leaks.txt)" = "0" ]
|
||||
|
||||
macos:
|
||||
runs-on: macOS-latest
|
||||
runs-on: macOS-12
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
env:
|
||||
VFLAGS: -cc clang
|
||||
|
@ -188,6 +196,8 @@ jobs:
|
|||
echo $VFLAGS
|
||||
./v cmd/tools/test_if_v_test_system_works.v
|
||||
./cmd/tools/test_if_v_test_system_works
|
||||
- name: All code is formatted
|
||||
run: VJOBS=1 ./v test-cleancode
|
||||
- name: Self tests
|
||||
run: VJOBS=1 ./v test-self
|
||||
- name: Build examples
|
||||
|
@ -232,6 +242,7 @@ jobs:
|
|||
|
||||
ubuntu:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
@ -261,8 +272,8 @@ jobs:
|
|||
## run: ./v -o hi.js examples/hello_v_js.v && node hi.js
|
||||
# - name: Build Vorum
|
||||
# run: git clone --depth 1 https://github.com/vlang/vorum && cd vorum && ../v . && cd ..
|
||||
# - name: Build vpm
|
||||
# run: git clone --depth 1 https://github.com/vlang/vpm && cd vpm && ../v . && cd ..
|
||||
- name: Build vpm
|
||||
run: git clone --depth 1 https://github.com/vlang/vpm && cd vpm && ../v . && cd ..
|
||||
- name: Freestanding
|
||||
run: ./v -freestanding run vlib/os/bare/bare_example_linux.v
|
||||
- name: V self compilation
|
||||
|
@@ -280,12 +291,14 @@ jobs:
echo $VFLAGS
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
- name: All code is formatted
run: ./v test-cleancode
- name: Self tests
run: ./v test-self
- name: Self tests (-prod)
run: ./v -o vprod -prod cmd/v && ./vprod test-self
- name: Self tests (-cstrict)
run: ./v -cc gcc -cstrict test-self
run: VTEST_JUST_ESSENTIAL=1 V_CI_CSTRICT=1 ./v -cc gcc -cstrict test-self
- name: Build examples
run: ./v build-examples
- name: Build tetris.v with -autofree
@@ -329,6 +342,7 @@ jobs:
ubuntu-clang:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 121
env:
VFLAGS: -cc clang

@@ -372,12 +386,18 @@ jobs:
echo $VFLAGS
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
- name: All code is formatted
run: ./v test-cleancode
- name: Self tests
run: ./v test-self
- name: Self tests (-prod)
run: ./v -o vprod -prod cmd/v && ./vprod test-self
- name: Self tests (vprod)
run: |
./v -o vprod -prod cmd/v
./vprod test-self
- name: Self tests (-cstrict)
run: ./v -cstrict test-self
run: VTEST_JUST_ESSENTIAL=1 V_CI_CSTRICT=1 ./vprod -cstrict test-self
- name: Build examples
run: ./v build-examples
- name: Build examples with -autofree
@@ -413,6 +433,7 @@ jobs:
windows-gcc:
runs-on: windows-2019
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 121
env:
VFLAGS: -cc gcc

@@ -429,7 +450,7 @@ jobs:
- name: Test new v.c
run: |
.\v.exe -o v.c cmd/v
gcc -Werror -I ./thirdparty/stdatomic/win -municode -w v.c
gcc -Werror -municode -w v.c
- name: Install dependencies
run: |
.\v.exe setup-freetype
@@ -442,10 +463,28 @@ jobs:
echo $VFLAGS
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
- name: All code is formatted
run: ./v test-cleancode
- name: Self tests
run: .\v.exe test-self
# - name: Test
# run: .\v.exe test-all
- name: Test time functions in a timezone UTC-12
run: |
tzutil /s "Dateline Standard Time"
./v test vlib/time/
- name: Test time functions in a timezone UTC-3
run: |
tzutil /s "Greenland Standard Time"
./v test vlib/time/
- name: Test time functions in a timezone UTC+3
run: |
tzutil /s "Russian Standard Time"
./v test vlib/time/
- name: Test time functions in a timezone UTC+12
run: |
tzutil /s "New Zealand Standard Time"
./v test vlib/time/
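`tzutil /s` is Windows-only; the same UTC-12 / UTC+3 / UTC+12 pinning can be reproduced on Linux or macOS with the `TZ` environment variable when running the `vlib/time/` tests locally. Note the POSIX quirk that the `Etc/GMT±N` zone names invert the sign (this sketch assumes your system ships the tzdata `Etc/*` zones):

```shell
# Etc/GMT+12 is UTC-12 and Etc/GMT-3 is UTC+3: POSIX flips the sign.
TZ=Etc/GMT+12 date +%z   # -1200, same offset as "Dateline Standard Time"
TZ=Etc/GMT-3  date +%z   # +0300, same offset as "Russian Standard Time"
# e.g. run the time tests pinned to UTC-12:  TZ=Etc/GMT+12 ./v test vlib/time/
```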
- name: Test v->js
run: ./v -o hi.js examples/hello_v_js.v && node hi.js
- name: Test v binaries
@@ -457,6 +496,7 @@ jobs:
windows-msvc:
runs-on: windows-2019
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 121
env:
VFLAGS: -cc msvc

@@ -484,6 +524,9 @@ jobs:
echo $VFLAGS
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
### TODO: test-cleancode fails with msvc. Investigate why???
## - name: All code is formatted
## run: ./v test-cleancode
- name: Self tests
run: |
./v -cg cmd\tools\vtest-self.v

@@ -501,6 +544,7 @@ jobs:
windows-tcc:
runs-on: windows-2019
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 121
env:
VFLAGS: -cc tcc -no-retry-compilation
@@ -518,7 +562,7 @@ jobs:
- name: Test new v.c
run: |
.\v.exe -o v.c cmd/v
.\thirdparty\tcc\tcc.exe -I ./thirdparty/stdatomic/win -Werror -w -ladvapi32 -bt10 v.c
.\thirdparty\tcc\tcc.exe -Werror -w -ladvapi32 -bt10 v.c
- name: Install dependencies
run: |
.\v.exe setup-freetype

@@ -539,8 +583,8 @@ jobs:
run: ./v doc -v clipboard
- name: Test v build-tools
run: ./v -W build-tools
- name: Test ./v doc clipboard
run: ./v doc clipboard
- name: All code is formatted
run: ./v test-cleancode
- name: Self tests
run: ./v test-self
- name: Test v->js
@@ -550,7 +594,9 @@ jobs:
- name: Build examples
run: ./v build-examples
- name: v2 self compilation
run: .\v.exe -o v2.exe cmd/v && .\v2.exe -o v3.exe cmd/v
run: .\v.exe -o v2.exe cmd/v && .\v2.exe -o v3.exe cmd/v && .\v3.exe -o v4.exe cmd/v
- name: v2 self compilation with -gc boehm
run: .\v.exe -o v2.exe -gc boehm cmd/v && .\v2.exe -o v3.exe -gc boehm cmd/v && .\v3.exe -o v4.exe -gc boehm cmd/v
## ## tcc32
## - name: Build with make.bat -tcc32

@@ -559,7 +605,7 @@ jobs:
## .\v.exe wipe-cache
## .\make.bat -tcc32
## - name: Test new v.c
## run: .\v.exe -o v.c cmd/v && .\thirdparty\tcc\tcc.exe -I ./thirdparty/stdatomic/win -Werror -g -w -ladvapi32 -bt10 v.c
## run: .\v.exe -o v.c cmd/v && .\thirdparty\tcc\tcc.exe -Werror -g -w -ladvapi32 -bt10 v.c
## - name: v doctor
## run: ./v doctor
##

@@ -593,40 +639,42 @@ jobs:
## run: .\v.exe -o v2.exe cmd/v && .\v2.exe -o v3.exe cmd/v
# ubuntu-autofree-selfcompile:
# runs-on: ubuntu-20.04
# timeout-minutes: 121
# env:
# VFLAGS: -cc gcc
# steps:
# - uses: actions/checkout@v2
# - name: Build V
# run: make -j4
# - name: V self compilation with -autofree
# run: ./v -o v2 -autofree cmd/v && ./v2 -o v3 -autofree cmd/v && ./v3 -o v4 -autofree cmd/v
# ubuntu-autofree-selfcompile:
# runs-on: ubuntu-20.04
# if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
# timeout-minutes: 121
# env:
# VFLAGS: -cc gcc
# steps:
# - uses: actions/checkout@v2
# - name: Build V
# run: make -j4
# - name: V self compilation with -autofree
# run: ./v -o v2 -autofree cmd/v && ./v2 -o v3 -autofree cmd/v && ./v3 -o v4 -autofree cmd/v

# ubuntu-musl:
# runs-on: ubuntu-20.04
# timeout-minutes: 121
# env:
# VFLAGS: -cc musl-gcc
# V_CI_MUSL: 1
# steps:
# - uses: actions/checkout@v2
# - uses: actions/setup-node@v1
# with:
# node-version: 12.x
# - name: Install dependencies
# run: |
# sudo apt-get install --quiet -y musl musl-tools libssl-dev sqlite3 libsqlite3-dev valgrind
# - name: Build v
# run: echo $VFLAGS && make -j4 && ./v -cg -o v cmd/v
# # - name: Test v binaries
# # run: ./v build-vbinaries
# ## - name: Test v->js
# ## run: ./v -o hi.js examples/hello_v_js.v && node hi.js
# - name: quick debug
# run: ./v -stats vlib/strconv/format_test.v
# - name: Self tests
# run: ./v test-self
# ubuntu-musl:
# runs-on: ubuntu-20.04
# if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
# timeout-minutes: 121
# env:
# VFLAGS: -cc musl-gcc
# V_CI_MUSL: 1
# steps:
# - uses: actions/checkout@v2
# - uses: actions/setup-node@v1
# with:
# node-version: 12.x
# - name: Install dependencies
# run: |
# sudo apt-get install --quiet -y musl musl-tools libssl-dev sqlite3 libsqlite3-dev valgrind
# - name: Build v
# run: echo $VFLAGS && make -j4 && ./v -cg -o v cmd/v
# # - name: Test v binaries
# # run: ./v build-vbinaries
# ## - name: Test v->js
# ## run: ./v -o hi.js examples/hello_v_js.v && node hi.js
# - name: quick debug
# run: ./v -stats vlib/strconv/format_test.v
# - name: Self tests
# run: ./v test-self

@@ -11,10 +11,11 @@ on:
jobs:
ubuntu:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 30
env:
VFLAGS: -cc tcc -no-retry-compilation
B_CFLAGS: -g -std=gnu11 -I ./thirdparty/stdatomic/nix -w
B_CFLAGS: -g -std=gnu11 -w
B_LFLAGS: -lm -lpthread
steps:
- uses: actions/checkout@v2

@@ -49,10 +50,11 @@ jobs:
macos:
runs-on: macos-11
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 30
env:
VFLAGS: -cc clang
B_CFLAGS: -g -std=gnu11 -I ./thirdparty/stdatomic/nix -w
B_CFLAGS: -g -std=gnu11 -w
B_LFLAGS: -lm -lpthread
steps:
- uses: actions/checkout@v2

@@ -12,6 +12,7 @@ jobs:
macos-cross:
runs-on: macOS-latest
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 25
env:
VFLAGS: -cc clang

@@ -41,13 +42,9 @@ jobs:
./v -os windows cmd/v
./v -os windows examples/2048/2048.v
- name: Compile to raw Android (non-graphic) compatible
run: |
# Test that V can compile non-graphic app to Android compatible code *without* using the -apk flag
./v -os android examples/toml.v
linux-cross:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 25
env:
VFLAGS: -cc tcc -no-retry-compilation

@@ -62,6 +59,7 @@ jobs:
sudo apt-get install --quiet -y libssl-dev sqlite3 libsqlite3-dev
sudo apt-get install --quiet -y mingw-w64 wine-stable winetricks
## sudo apt-get install --quiet -y wine32
- name: Turn off the wine crash dialog
run: winetricks nocrashdialog

@@ -71,14 +69,14 @@ jobs:
- name: v.c can be compiled and run with -os cross
run: |
./v -os cross -o /tmp/v.c cmd/v
gcc -g -std=gnu11 -I ./thirdparty/stdatomic/nix -w -o v_from_vc /tmp/v.c -lm -lpthread
gcc -g -std=gnu11 -w -o v_from_vc /tmp/v.c -lm -lpthread
ls -lart v_from_vc
./v_from_vc version
- name: v_win.c can be compiled and run with -os windows
run: |
./v -os windows -o /tmp/v_win.c cmd/v
x86_64-w64-mingw32-gcc -I ./thirdparty/stdatomic/win /tmp/v_win.c -std=c99 -w -municode -o v_from_vc.exe
./v -cc msvc -os windows -o /tmp/v_win.c cmd/v
x86_64-w64-mingw32-gcc /tmp/v_win.c -std=c99 -w -municode -o v_from_vc.exe
ls -lart v_from_vc.exe
wine64 ./v_from_vc.exe version

@@ -93,17 +91,10 @@ jobs:
./v -os windows examples/2048/2048.v
ls -lart examples/2048/2048.exe
- name: toml.v can be compiled to raw Android C
run: |
# Test that V can compile non-graphic app to Android compatible code *without* using the -apk flag
./v -os android examples/toml.v
windows-cross:
runs-on: windows-2019
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 25
env:
VFLAGS: -cc msvc
steps:
- uses: actions/checkout@v2
- name: Build

@@ -113,8 +104,8 @@ jobs:
.\make.bat -msvc
- name: TODO v_win.c can be compiled and run with -os windows
run: |
.\v.exe -os windows -showcc -o v2.exe cmd\v
.\v.exe -os windows -o v_win.c cmd\v
.\v.exe -os windows -cc msvc -showcc -o v2.exe cmd\v
.\v.exe -os windows -cc msvc -o v_win.c cmd\v
dir v2.exe
dir v_win.c
.\v2.exe version

@@ -16,9 +16,14 @@ on:
paths:
- '!**'
- 'cmd/tools/vtest*'
- 'cmd/tools/builders/**.v'
- 'vlib/builtin/**.v'
- 'vlib/strconv/**.v'
- 'vlib/strings/**.v'
- 'vlib/math/**.v'
- 'vlib/math/big/**.v'
- 'vlib/arrays/**.v'
- 'vlib/datatypes/**.v'
- 'vlib/os/**.v'
- 'vlib/sync/**.v'
- 'vlib/v/tests/**.v'

@@ -27,18 +32,25 @@ on:
- 'vlib/v/parser/**.v'
- 'vlib/v/checker/**.v'
- 'vlib/v/gen/c/**.v'
- 'vlib/v/builder/**.v'
- 'vlib/v/cflag/**.v'
- 'vlib/v/live/**.v'
- 'vlib/v/util/**.v'
- 'vlib/v/markused/**.v'
- 'vlib/v/preludes/**.v'
- 'vlib/v/embed_file/**.v'
pull_request:
paths:
- '!**'
- 'cmd/tools/vtest*'
- 'cmd/tools/builders/**.v'
- 'vlib/builtin/**.v'
- 'vlib/strconv/**.v'
- 'vlib/strings/**.v'
- 'vlib/math/**.v'
- 'vlib/math/big/**.v'
- 'vlib/arrays/**.v'
- 'vlib/datatypes/**.v'
- 'vlib/os/**.v'
- 'vlib/sync/**.v'
- 'vlib/v/tests/**.v'

@@ -47,9 +59,11 @@ on:
- 'vlib/v/parser/**.v'
- 'vlib/v/checker/**.v'
- 'vlib/v/gen/c/**.v'
- 'vlib/v/builder/**.v'
- 'vlib/v/cflag/**.v'
- 'vlib/v/live/**.v'
- 'vlib/v/util/**.v'
- 'vlib/v/markused/**.v'
- 'vlib/v/preludes/**.v'
- 'vlib/v/embed_file/**.v'

@@ -60,6 +74,7 @@ concurrency:
jobs:
tests-sanitize-undefined-clang:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc clang

@@ -77,7 +92,9 @@ jobs:
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
sudo apt-get install clang
- name: Build V
run: make -j4 && ./v -cg -cstrict -o v cmd/v
run: make && ./v -cg -cstrict -o v cmd/v
- name: Ensure code is well formatted
run: ./v test-cleancode
- name: Self tests (-fsanitize=undefined)
run: ./v -cflags "-fsanitize=undefined" -o v2 cmd/v && ./v2 -cflags -fsanitize=undefined test-self
- name: Build examples (V compiled with -fsanitize=undefined)

@@ -85,6 +102,7 @@ jobs:
tests-sanitize-undefined-gcc:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc gcc

@@ -101,7 +119,9 @@ jobs:
sudo apt-get install --quiet -y postgresql libpq-dev libssl-dev sqlite3 libsqlite3-dev valgrind
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
- name: Build V
run: make -j4 && ./v -cg -cstrict -o v cmd/v
run: make && ./v -cg -cstrict -o v cmd/v
- name: Ensure code is well formatted
run: ./v test-cleancode
- name: Self tests (-fsanitize=undefined)
run: ./v -cflags "-fsanitize=undefined" -o v2 cmd/v && ./v2 -cflags -fsanitize=undefined test-self
- name: Build examples (V compiled with -fsanitize=undefined)

@@ -109,6 +129,7 @@ jobs:
tests-sanitize-address-clang:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc clang

@@ -126,7 +147,9 @@ jobs:
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
sudo apt-get install clang
- name: Build V
run: make -j4 && ./v -cg -cstrict -o v cmd/v
run: make && ./v -cg -cstrict -o v cmd/v
- name: Ensure code is well formatted
run: ./v test-cleancode
- name: Self tests (-fsanitize=address)
run: ASAN_OPTIONS=detect_leaks=0 ./v -cflags "-fsanitize=address,pointer-compare,pointer-subtract" test-self
- name: Self tests (V compiled with -fsanitize=address)

@@ -138,6 +161,7 @@ jobs:
tests-sanitize-address-msvc:
runs-on: windows-2019
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc msvc

@@ -154,6 +178,11 @@ jobs:
echo $VFLAGS
.\make.bat -msvc
.\v.exe self
- name: Ensure code is well formatted
run: |
.\v.exe fmt -verify vlib/builtin/ vlib/v/scanner/ vlib/v/parser/ vlib/v/gen/
## TODO: check to see why `v test-cleancode` does not work with msvc on windows
## - name: Install dependencies
## run: |
## .\v.exe setup-freetype

@@ -164,6 +193,7 @@ jobs:
tests-sanitize-address-gcc:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc gcc

@@ -181,7 +211,9 @@ jobs:
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
sudo apt-get install clang
- name: Build V
run: make -j4 && ./v -cg -cstrict -o v cmd/v
run: make && ./v -cg -cstrict -o v cmd/v
- name: Ensure code is well formatted
run: ./v test-cleancode
- name: Self tests (-fsanitize=address)
run: ASAN_OPTIONS=detect_leaks=0 ./v -cflags -fsanitize=address test-self
- name: Self tests (V compiled with -fsanitize=address)

@@ -193,9 +225,10 @@ jobs:
tests-sanitize-memory-clang:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 180
env:
VFLAGS: -cc clang
VFLAGS: -cc clang -gc none
VJOBS: 1
VTEST_SHOW_START: 1
steps:

@@ -210,11 +243,14 @@ jobs:
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
sudo apt-get install clang
- name: Build V
run: make -j4 && ./v -cc clang -cg -cstrict -o v cmd/v
run: make && ./v -cc clang -cg -cstrict -o v cmd/v
- name: Ensure code is well formatted
run: ./v test-cleancode
- name: Self tests (-fsanitize=memory)
run: ./v -cflags -fsanitize=memory test-self
- name: Self tests (V compiled with -fsanitize=memory)
run:
./v -cflags -fsanitize=memory -o v cmd/v && ./v -cc tcc test-self -msan-compiler
run: |
./v -cflags -fsanitize=memory -o v cmd/v
./v -cc tcc test-self -msan-compiler
- name: Build examples (V compiled with -fsanitize=memory)
run: ./v build-examples

@@ -0,0 +1,36 @@
name: vlang benchmarks

on:
push:
paths-ignore:
- "**.md"
pull_request:
paths-ignore:
- "**.md"

jobs:
run:
name: Run
runs-on: ubuntu-latest
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
steps:
- name: Checkout
uses: actions/checkout@v2
- name: Install google benchmark
run: |
git clone https://github.com/google/benchmark.git
cd benchmark
cmake -E make_directory "build"
cmake -E chdir "build" cmake -DBENCHMARK_DOWNLOAD_DEPENDENCIES=on -DCMAKE_BUILD_TYPE=Release ../
sudo cmake --build "build" --config Release --target install
- name: Run V benchmark
run: |
make
sudo ./v symlink
git clone https://github.com/vincenzopalazzo/benchmarks.git
cd benchmarks
make vdep && make v
- uses: actions/upload-artifact@v3
with:
name: vlang-benchmark
path: benchmarks/vlang/*.json
@@ -16,7 +16,8 @@ jobs:
alpine-docker-musl-gcc:
runs-on: ubuntu-20.04
timeout-minutes: 121
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 181
container:
# Alpine docker pre-built container
image: thevlang/vlang:alpine-build

@@ -42,18 +43,22 @@ jobs:
- name: Build V
run: CC=gcc make
- name: Test V fixed tests
run: ./v test-self
- name: All code is formatted
run: ./v test-cleancode
- name: Run only essential tests
run: VTEST_JUST_ESSENTIAL=1 ./v test-self
ubuntu-docker-musl:
runs-on: ubuntu-20.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 121
container:
image: thevlang/vlang:ubuntu-build
env:
V_CI_MUSL: 1
V_CI_UBUNTU_MUSL: 1
VFLAGS: -cc musl-gcc
VFLAGS: -cc musl-gcc -gc none
volumes:
- ${{github.workspace}}:/opt/vlang

@@ -69,5 +74,9 @@ jobs:
echo $VFLAGS
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
- name: All code is formatted
run: ./v test-cleancode
- name: Test V fixed tests
run: ./v test-self

@@ -25,12 +25,13 @@ jobs:
.\v.exe setup-freetype
.\.github\workflows\windows-install-sqlite.bat
- name: v doctor
run: |
./v doctor
run: ./v doctor
- name: Verify `v test` works
run: |
./v cmd/tools/test_if_v_test_system_works.v
./cmd/tools/test_if_v_test_system_works
- name: All code is formatted
run: ./v test-cleancode
- name: Self tests
run: |
./v -cg cmd\tools\vtest-self.v

@@ -21,7 +21,7 @@ jobs:
runs-on: ubuntu-20.04
timeout-minutes: 5
env:
MOPTIONS: --no-line-numbers --relative-paths --exclude /vlib/v/ --exclude /builtin/linux_bare/ --exclude /testdata/ --exclude /tests/ vlib/
MOPTIONS: --relative-paths --exclude /vlib/v/ --exclude /builtin/linux_bare/ --exclude /testdata/ --exclude /tests/
steps:
- uses: actions/checkout@v2
- name: Build V

@@ -35,14 +35,4 @@ jobs:
- name: Check against parent commit
run: |
./v run cmd/tools/missdoc.v $MOPTIONS | sort > /tmp/n_v.txt
cd pv/ && ../v run ../cmd/tools/missdoc.v $MOPTIONS | sort > /tmp/o_v.txt
count_new=$(cat /tmp/n_v.txt | wc -l)
count_old=$(cat /tmp/o_v.txt | wc -l)
echo "new pubs: $count_new | old pubs: $count_old"
echo "new head: $(head -n1 /tmp/n_v.txt)"
echo "old head: $(head -n1 /tmp/o_v.txt)"
if [[ ${count_new} -gt ${count_old} ]]; then
echo "The following $((count_new-count_old)) function(s) are introduced with no documentation:"
diff /tmp/n_v.txt /tmp/o_v.txt ## diff does exit(1) when files are different
fi
./v missdoc --diff $MOPTIONS pv/vlib vlib
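The deleted script above counted undocumented public functions in the current tree versus the parent commit and diffed the two sorted listings; `./v missdoc --diff` now does the comparison in one step. The core of the old count-and-compare idea can be sketched with `comm` (the report lines below are hypothetical):

```shell
# Sorted "missing documentation" reports: parent commit vs current tree.
printf 'pub fn a()\npub fn b()\n'             | sort > /tmp/o_v.txt  # hypothetical parent report
printf 'pub fn a()\npub fn b()\npub fn c()\n' | sort > /tmp/n_v.txt  # hypothetical new report
# Lines present only in the new report are newly undocumented functions.
comm -23 /tmp/n_v.txt /tmp/o_v.txt   # -> pub fn c()
```

`comm -23` suppresses lines unique to the second file and lines common to both, leaving exactly the regressions; both inputs must be sorted.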
@@ -10,6 +10,7 @@ on:
jobs:
build-vc:
runs-on: ubuntu-latest
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
env:
VREPO: github.com/vlang/vc.git
steps:

@@ -31,7 +32,7 @@ jobs:
rm -rf vc/v.c vc/v_win.c
./v -o vc/v.c -os cross cmd/v
./v -o vc/v_win.c -os windows cmd/v
./v -o vc/v_win.c -os windows -cc msvc cmd/v
sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v.c
sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v_win.c

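The two `sed -i "1s/^/..."` lines above prepend a `#define V_COMMIT_HASH` to the generated C sources. A minimal sketch of that prepend with GNU sed (the hash value here is invented for illustration):

```shell
COMMIT_HASH=abc1234                       # hypothetical commit hash
printf 'int main() { return 0; }\n' > v.c
# "1s/^/.../" substitutes at line 1 only; the \n in the replacement
# pushes the original first line down, effectively prepending a line.
sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" v.c
head -n1 v.c   # -> #define V_COMMIT_HASH "abc1234"
```

In-place editing via `-i` without a backup suffix and `\n` in the replacement are GNU sed behaviors, matching the Linux runner the job uses.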
@@ -11,6 +11,7 @@ on:
jobs:
gg-regressions:
runs-on: ubuntu-18.04
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
timeout-minutes: 10
env:
VFLAGS: -cc tcc

@@ -20,25 +21,24 @@ jobs:
uses: actions/checkout@v2
- name: Build local v
run: make -j4
run: make
- uses: openrndr/setup-opengl@v1.1
- name: Setup dependencies
run: |
# imagemagick : convert, mogrify
# xvfb : xvfb (installed by openrndr/setup-opengl@v1.1)
# openimageio-tools : idiff
# libxcursor-dev libxi-dev : V gfx deps
# mesa-common-dev : For headless rendering
# freeglut3-dev : Fixes graphic apps compilation with tcc
sudo apt-get update
sudo apt-get install imagemagick openimageio-tools mesa-common-dev libxcursor-dev libxi-dev freeglut3-dev
wget https://raw.githubusercontent.com/tremby/imgur.sh/c98345d/imgur.sh
git clone https://github.com/Larpon/gg-regression-images gg-regression-images
chmod +x ./imgur.sh
- uses: openrndr/setup-opengl@v1.1
- uses: actions/checkout@v2
with:
repository: Larpon/gg-regression-images
path: gg-regression-images
- name: Sample and compare
id: compare
continue-on-error: true

@@ -50,4 +50,5 @@ jobs:
if: steps.compare.outcome != 'success'
run: |
./imgur.sh /tmp/fail.png
./imgur.sh /tmp/diff.png
exit 1
@@ -0,0 +1,36 @@
name: native backend CI

on:
push:
paths-ignore:
- "**.md"
pull_request:
paths-ignore:
- "**.md"

concurrency:
group: native-backend-ci-${{ github.event.pull_request.number || github.sha }}
cancel-in-progress: true

jobs:
native-backend:
strategy:
matrix:
os: [ubuntu-18.04, ubuntu-20.04, macos-10.15, macos-11, macos-12, windows-2016, windows-2019, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- uses: actions/checkout@v2
- name: Build V with make.bat
if: ${{ startsWith(matrix.os, 'windows') }}
run: |
.\make.bat
.\v.exe symlink -githubci
- name: Build V with make
if: ${{ !startsWith(matrix.os, 'windows') }}
run: |
make
./v symlink -githubci
- name: Test the native backend
run: v test vlib/v/gen/native/
@ -15,6 +15,7 @@ concurrency:
|
|||
jobs:
|
||||
no-gpl-by-accident:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 15
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
@ -24,6 +25,7 @@ jobs:
|
|||
|
||||
code-formatting:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 15
|
||||
env:
|
||||
VFLAGS: -cc gcc
|
||||
|
@ -40,6 +42,7 @@ jobs:
|
|||
|
||||
performance-regressions:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 15
|
||||
env:
|
||||
VFLAGS: -cc gcc
|
||||
|
@ -64,6 +67,7 @@ jobs:
|
|||
|
||||
misc-tooling:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
env:
|
||||
VFLAGS: -cc tcc -no-retry-compilation
|
||||
|
@ -71,15 +75,24 @@ jobs:
|
|||
- uses: actions/checkout@v2
|
||||
with:
|
||||
fetch-depth: 10
|
||||
|
||||
- name: Install dependencies
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install --quiet -y libssl-dev sqlite3 libsqlite3-dev valgrind
|
||||
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
|
||||
sudo apt-get install --quiet -y xfonts-75dpi xfonts-base
|
||||
sudo apt-get install --quiet -y libsodium-dev libssl-dev sqlite3 libsqlite3-dev postgresql libpq-dev valgrind
|
||||
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev xfonts-75dpi xfonts-base
|
||||
sudo apt-get install --quiet -y g++-9
|
||||
|
||||
- name: Build v
|
||||
run: make
|
||||
|
||||
- name: g++ version
|
||||
run: g++-9 --version
|
||||
- name: V self compilation with g++
|
||||
run: ./v -cc g++-9 -no-std -cflags -std=c++11 -o v2 cmd/v && ./v2 -cc g++-9 -no-std -cflags -std=c++11 -o v3 cmd/v
|
||||
## - name: Running tests with g++
|
||||
## run: ./v -cc g++-9 test-self
|
||||
|
||||
- name: Ensure V can be compiled with -autofree
|
||||
run: ./v -autofree -o v2 cmd/v ## NB: this does not mean it runs, but at least keeps it from regressing
|
||||
|
||||
|
@ -97,25 +110,11 @@ jobs:
|
|||
echo "compiling shader $f.glsl ..."; \
|
||||
./sokol-shdc --input $f.glsl --output $f.h --slang glsl330 ; \
|
||||
done
|
||||
for vfile in examples/sokol/0?*/*.v; do echo "compiling $vfile ..."; ./v $vfile ; done
|
||||
|
||||
- name: Install C++ dependencies
|
||||
run: |
|
||||
sudo apt-get install --quiet -y postgresql libpq-dev libssl-dev sqlite3 libsqlite3-dev
|
||||
sudo apt-get install --quiet -y libfreetype6-dev libxi-dev libxcursor-dev libgl-dev
|
||||
sudo apt-get install --quiet -y valgrind g++-9
|
||||
- name: Build V
|
||||
run: make -j4
|
||||
- name: g++ version
|
||||
run: g++-9 --version
|
||||
- name: V self compilation with g++
|
||||
run: ./v -cc g++-9 -no-std -cflags -std=c++11 -o v2 cmd/v && ./v2 -cc g++-9 -no-std -cflags -std=c++11 -o v3 cmd/v
|
||||
## - name: Running tests with g++
|
||||
## run: ./v -cc g++-9 test-self
|
||||
|
||||
./v should-compile-all examples/sokol/*.v examples/sokol/0?*/*.v
|
||||
|
||||
parser-silent:
|
||||
runs-on: ubuntu-20.04
|
||||
if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
|
||||
timeout-minutes: 121
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
|
@ -152,105 +151,3 @@ jobs:
|
|||
./v test-parser -S examples/regex_example_fuzz.v
|
||||
./v test-parser -S examples/2048/2048_fuzz.v
|
||||
|
||||
  v-apps-compile:
    runs-on: ubuntu-20.04
    timeout-minutes: 121
    steps:
      - uses: actions/checkout@v2
      - name: Build V
        run: make && sudo ./v symlink

      - name: Install dependencies
        run: sudo apt-get install --quiet -y libgc-dev

      ## vls
      - name: Clone VLS
        run: git clone --depth 1 https://github.com/vlang/vls
      - name: Build VLS
        run: pushd vls; v cmd/vls ; popd
      - name: Build VLS with -prod
        run: pushd vls; v -prod cmd/vls; popd
      - name: Build VLS with -gc boehm -skip-unused
        run: pushd vls; v -gc boehm -skip-unused cmd/vls; popd

      ## vsl
      - name: Clone VSL
        run: git clone --depth 1 https://github.com/vlang/vsl ~/.vmodules/vsl
      - name: Install dependencies
        run: sudo apt-get install --quiet -y --no-install-recommends gfortran liblapacke-dev libopenblas-dev libgc-dev
      - name: Execute Tests using Pure V Backend
        run: ~/.vmodules/vsl/bin/test
      - name: Execute Tests using Pure V Backend with Pure V Math
        run: ~/.vmodules/vsl/bin/test --use-cblas
      - name: Execute Tests using Pure V Backend and Garbage Collection enabled
        run: ~/.vmodules/vsl/bin/test --use-gc boehm
      - name: Execute Tests using Pure V Backend with Pure V Math and Garbage Collection enabled
        run: ~/.vmodules/vsl/bin/test --use-cblas --use-gc boehm

      ## vtl
      - name: Clone VTL
        run: git clone --depth 1 https://github.com/vlang/vtl ~/.vmodules/vtl
      - name: Install dependencies
        run: sudo apt-get install --quiet -y --no-install-recommends gfortran liblapacke-dev libopenblas-dev libgc-dev
      - name: Execute Tests using Pure V Backend
        run: ~/.vmodules/vtl/bin/test
      - name: Execute Tests using Pure V Backend with Pure V Math
        run: ~/.vmodules/vtl/bin/test --use-cblas
      - name: Execute Tests using Pure V Backend and Garbage Collection enabled
        run: ~/.vmodules/vtl/bin/test --use-gc boehm
      - name: Execute Tests using Pure V Backend with Pure V Math and Garbage Collection enabled
        run: ~/.vmodules/vtl/bin/test --use-cblas --use-gc boehm

      ## vab
      - name: Clone vab
        run: git clone --depth 1 https://github.com/vlang/vab
      - name: Build vab
        run: cd vab; ../v ./vab.v ; cd ..
      - name: Build vab with -gc boehm -skip-unused
        run: cd vab; ../v -gc boehm -skip-unused ./vab.v ; cd ..

      ## gitly
      - name: Install markdown
        run: ./v install markdown
      - name: Build Gitly
        run: |
          git clone --depth 1 https://github.com/vlang/gitly
          cd gitly
          ../v .
          # ./gitly -ci_run
          ../v -autofree .
          ../v -o x tests/first_run.v
          ./x
          cd ..

      ## vex
      - name: Install Vex dependencies
        run: sudo apt-get install --quiet -y libssl-dev sqlite3 libsqlite3-dev
      - name: Install Vex
        run: mkdir -p ~/.vmodules/nedpals; git clone https://github.com/nedpals/vex ~/.vmodules/nedpals/vex
      - name: Compile the simple Vex example
        run: ./v ~/.vmodules/nedpals/vex/examples/simple_example.v
      - name: Compile the simple Vex example with -gc boehm -skip-unused
        run: ./v -gc boehm -skip-unused ~/.vmodules/nedpals/vex/examples/simple_example.v
      - name: Run Vex Tests
        run: ./v test ~/.vmodules/nedpals/vex

      ## vpm modules
      - name: Install UI through VPM
        run: ./v install ui

      ## libsodium
      - name: Install libsodium-dev package
        run: sudo apt-get install --quiet -y libsodium-dev
      - name: Install the libsodium wrapper through VPM
        run: ./v install libsodium
      - name: Test libsodium
        run: VJOBS=1 ./v -stats test ~/.vmodules/libsodium

      ## Go2V
      - name: Clone & Build go2v
        run: git clone --depth=1 https://github.com/vlang/go2v go2v/
      - name: Build go2v
        run: ./v go2v/
      - name: Run tests for go2v
        run: VJOBS=1 ./v -stats test go2v/
@@ -16,6 +16,7 @@ jobs:
  space-paths-linux:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 30
    env:
      MY_V_PATH: '你好 my $path, @с интервали'

@@ -41,6 +42,7 @@ jobs:
  space-paths-macos:
    runs-on: macOS-latest
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 30
    env:
      MY_V_PATH: '你好 my $path, @с интервали'

@@ -69,6 +71,7 @@ jobs:
  space-paths-windows:
    runs-on: windows-2022
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 30
    env:
      MY_V_PATH: 'path with some $punctuation, and some spaces'
@@ -2,57 +2,56 @@ name: Periodic

on:
  schedule:
-    - cron: '0 */2 * * *'
+    - cron: '0 */6 * * *'

jobs:
  network-tests-ubuntu:
    runs-on: ubuntu-20.04
    timeout-minutes: 30
    env:
      V_CI_PERIODIC: 1
    steps:
      - uses: actions/checkout@v2
-      - name: Install dependencies
-        run: sudo apt-get install --quiet -y libssl-dev sqlite3 libsqlite3-dev valgrind
+      - name: Install dependencies 1
+        run: sudo apt-get install --quiet -y libssl-dev sqlite3 libsqlite3-dev
      - name: Build v
-        run: make -j4
+        run: make
      - name: Symlink V
        run: sudo ./v symlink
##    - name: Run network tests
##      run: ./v -d network test vlib/net

  network-tests-macos:
    runs-on: macOS-latest
    timeout-minutes: 30
    env:
      V_CI_PERIODIC: 1
    steps:
      - uses: actions/checkout@v2
      - name: Setup openssl library path
        run: export LIBRARY_PATH="$LIBRARY_PATH:/usr/local/opt/openssl/lib/"
      - name: Build V
-        run: make -j4
+        run: make
      - name: Symlink V
        run: sudo ./v symlink
      - name: Ensure thirdparty/cJSON/cJSON.o is compiled, before running tests.
        run: ./v examples/json.v
##    - name: Run network tests
##      run: ./v -d network test vlib/net

  network-windows-msvc:
    runs-on: windows-2019
    timeout-minutes: 30
    env:
      V_CI_PERIODIC: 1
      VFLAGS: -cc msvc
    steps:
      - uses: actions/checkout@v2
      - name: Build
        run: |
          echo %VFLAGS%
          echo $VFLAGS
          .\make.bat -msvc
##    - name: Run network tests
##      run: .\v.exe -d network test vlib/net
@@ -11,6 +11,7 @@ on:
jobs:
  v-compiles-sdl-examples:
    runs-on: ubuntu-18.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 30
    env:
      VFLAGS: -cc tcc

@@ -11,6 +11,7 @@ on:
jobs:
  toml-module-pass-external-test-suites:
    runs-on: ubuntu-18.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 30
    env:
      TOML_BS_TESTS_PATH: vlib/toml/tests/testdata/burntsushi/toml-test
@@ -0,0 +1,137 @@
name: V Apps and Modules

on:
  push:
    paths-ignore:
      - "**.md"
  pull_request:
    paths-ignore:
      - "**.md"

concurrency:
  group: build-v-apps-and-modules-${{ github.event.pull_request.number || github.sha }}
  cancel-in-progress: true

jobs:
  v-apps-compile:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 121
    steps:
      - uses: actions/checkout@v2
      - name: Build V
        run: make && sudo ./v symlink

      - name: Install dependencies
        run: |
          sudo apt-get update
          sudo apt-get install --quiet -y libgc-dev libsodium-dev libssl-dev sqlite3 libsqlite3-dev valgrind libfreetype6-dev libxi-dev libxcursor-dev libgl-dev xfonts-75dpi xfonts-base
          sudo apt-get install --quiet -y --no-install-recommends gfortran liblapacke-dev libopenblas-dev

      - name: Build V Language Server (VLS)
        run: |
          echo "Clone VLS"
          git clone --depth 1 https://github.com/vlang/vls /tmp/vls
          echo "Build VLS"
          v /tmp/vls/cmd/vls
          echo "Build VLS with -prod"
          v -prod /tmp/vls/cmd/vls
          echo "Build VLS with -gc boehm -skip-unused"
          v -gc boehm -skip-unused /tmp/vls/cmd/vls

      - name: Build V Coreutils
        run: |
          echo "Clone Coreutils"
          git clone --depth 1 https://github.com/vlang/coreutils /tmp/coreutils
          echo "Build Coreutils"
          cd /tmp/coreutils; make

      - name: Build VAB
        run: |
          echo "Install VAB"
          v install vab
          echo "Build vab"
          v ~/.vmodules/vab
          echo "Build vab with -gc boehm -skip-unused"
          v -gc boehm -skip-unused ~/.vmodules/vab

      - name: Build Gitly
        run: |
          echo "Install markdown"
          v install markdown
          echo "Clone Gitly"
          git clone https://github.com/vlang/gitly /tmp/gitly
          echo "Build Gitly"
          v /tmp/gitly
          echo "Build Gitly with -autofree"
          v -autofree /tmp/gitly
          echo "Run first_run.v"
          v run /tmp/gitly/tests/first_run.v
          # /tmp/gitly/gitly -ci_run

      - name: Build libsodium
        run: |
          echo "Install the libsodium wrapper"
          v install libsodium
          echo "Test libsodium"
          VJOBS=1 v test ~/.vmodules/libsodium

      - name: Build VEX
        run: |
          echo "Install Vex"
          v install nedpals.vex
          echo "Compile all of the Vex examples"
          v should-compile-all ~/.vmodules/nedpals/vex/examples
          echo "Compile the simple Vex example with -gc boehm -skip-unused"
          v -gc boehm -skip-unused ~/.vmodules/nedpals/vex/examples/simple_example.v
          echo "Run Vex Tests"
          v test ~/.vmodules/nedpals/vex

      - name: Build go2v
        run: |
          echo "Clone Go2V"
          git clone --depth=1 https://github.com/vlang/go2v /tmp/go2v/
          echo "Build Go2V"
          v /tmp/go2v/
          echo "Run Go2V tests"
          VJOBS=1 v -stats test /tmp/go2v/

      - name: Build vlang/pdf
        run: |
          v install pdf
          echo "PDF examples should compile"
          v should-compile-all ~/.vmodules/pdf/examples

      - name: Install UI through VPM
        run: |
          echo "Official VPM modules should be installable"
          v install ui
          echo "Examples of UI should compile"
          v ~/.vmodules/ui/examples/build_examples.vsh

      - name: Build VSL
        run: |
          echo "Install VSL"
          v install vsl
          echo "Execute Tests using Pure V Backend"
          ~/.vmodules/vsl/bin/test
          echo "Execute Tests using Pure V Backend with Pure V Math"
          ~/.vmodules/vsl/bin/test --use-cblas
          echo "Execute Tests using Pure V Backend and Garbage Collection enabled"
          ~/.vmodules/vsl/bin/test --use-gc boehm
          echo "Execute Tests using Pure V Backend with Pure V Math and Garbage Collection enabled"
          ~/.vmodules/vsl/bin/test --use-cblas --use-gc boehm

      - name: Build VTL
        run: |
          echo "Install VTL"
          v install vtl
          echo "Install dependencies"
          echo "Execute Tests using Pure V Backend"
          ~/.vmodules/vtl/bin/test
          echo "Execute Tests using Pure V Backend with Pure V Math"
          ~/.vmodules/vtl/bin/test --use-cblas
          echo "Execute Tests using Pure V Backend and Garbage Collection enabled"
          ~/.vmodules/vtl/bin/test --use-gc boehm
          echo "Execute Tests using Pure V Backend with Pure V Math and Garbage Collection enabled"
          ~/.vmodules/vtl/bin/test --use-cblas --use-gc boehm
@@ -11,6 +11,7 @@ on:
jobs:
  vab-compiles-v-examples:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 121
    env:
      VAB_FLAGS: --api 30 --build-tools 29.0.0 -v 3

@@ -24,20 +25,14 @@ jobs:
      - name: Build V
        run: make && sudo ./v symlink

-      - name: Checkout vab
-        uses: actions/checkout@v2
-        with:
-          repository: vlang/vab
-          path: vab
-      - name: Build vab
+      - name: Install vab
        run: |
-          cd vab
-          v -g vab.v
-          sudo ln -s $(pwd)/vab /usr/local/bin/vab
+          v install vab
+          v -g ~/.vmodules/vab
+          sudo ln -s ~/.vmodules/vab/vab /usr/local/bin/vab

      - name: Run tests
-        run: v test vab
+        run: v test ~/.vmodules/vab

      - name: Run vab --help
        run: vab --help

@@ -53,3 +48,27 @@ jobs:
            safe_name=$(echo "$example" | sed 's%/%-%' | sed 's%\.%-%' )
            vab examples/$example -o apks/$safe_name.apk
          done

  v-compiles-os-android:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 10
    steps:
      - uses: actions/checkout@v2
      - name: Build V
        run: make && sudo ./v symlink

      - name: Install vab
        run: |
          v install vab
          v -g ~/.vmodules/vab
          sudo ln -s ~/.vmodules/vab/vab /usr/local/bin/vab

      - name: Run vab --help
        run: vab --help

      - name: Run vab doctor
        run: vab doctor

      - name: Check `v -os android` *without* -apk flag
        run: .github/workflows/android_cross_compile.vsh
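The `safe_name` mangling in the loop above can be checked in isolation. Note that each `sed` has no `g` flag, so only the first `/` and the first `.` are replaced; the path used below is an illustrative value, not one taken from an actual CI run:

```shell
# Reproduce the safe_name transformation from the vab examples loop.
# 'sokol/particles.v' is a made-up sample path for illustration.
example='sokol/particles.v'
safe_name=$(echo "$example" | sed 's%/%-%' | sed 's%\.%-%')
echo "$safe_name"
# → sokol-particles-v
```

This gives each example APK a flat file name with no directory separators or extension dots.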
@@ -13,6 +13,9 @@ on:
jobs:
  vinix-build:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    env:
      VFLAGS: -gc none
    steps:
      - uses: actions/checkout@v2

@@ -11,6 +11,7 @@ on:
jobs:
  websocket_tests:
    runs-on: ubuntu-20.04
    if: github.event_name != 'push' || github.event.ref == 'refs/heads/master' || github.event.repository.full_name != 'vlang/v'
    timeout-minutes: 121
    env:
      VFLAGS: -cc tcc -no-retry-compilation
@@ -1,12 +1,11 @@
@echo off

-curl -L https://www.sqlite.org/2020/sqlite-amalgamation-3320300.zip -o sqlite-amalgamation-3320300.zip
+curl -L https://www.sqlite.org/2022/sqlite-amalgamation-3380200.zip -o sqlite-amalgamation-3380200.zip

-unzip sqlite-amalgamation-3320300.zip -d thirdparty\
+unzip sqlite-amalgamation-3380200.zip -d thirdparty\

-del thirdparty\sqlite-amalgamation-3320300\shell.c
+del thirdparty\sqlite-amalgamation-3380200\shell.c

-move /y thirdparty\sqlite-amalgamation-3320300 thirdparty\sqlite
+move /y thirdparty\sqlite-amalgamation-3380200 thirdparty\sqlite

dir thirdparty\sqlite
@@ -0,0 +1,61 @@
platform: 'linux/amd64'
branches: ['master']

pipeline:
  gen-vc:
    # This is what the official CI uses as well
    image: 'ubuntu:latest'
    secrets:
      - deploy_key
    commands:
      # Install necessary dependencies
      - apt-get update -y && apt-get install openssh-client git build-essential -y
      # Build the compiler
      - make
      # Run ssh-agent
      - eval $(ssh-agent -s)
      # Add ssh key
      - echo "$DEPLOY_KEY" | tr -d '\r' | ssh-add -
      # Create ssh dir with proper permissions
      - mkdir -p ~/.ssh
      - chmod 700 ~/.ssh
      # Configure git credentials
      - git config --global user.email 'vbot@rustybever.be'
      - git config --global user.name 'vbot'
      # Verify SSH keys
      - ssh-keyscan git.rustybever.be > ~/.ssh/known_hosts

      # The following is copied over from the official repo's CI
      # https://github.com/vlang/v/blob/master/.github/workflows/gen_vc.yml
      - export "COMMIT_HASH=$(git rev-parse --short HEAD)"
      - export "COMMIT_MSG=$(git log -1 --oneline --pretty='%s' HEAD)"
      - rm -rf vc
      - git clone --depth=1 'git@git.rustybever.be:vieter/vc.git'
      - rm -rf vc/v.c vc/v_win.c
      - ./v -o vc/v.c -os cross cmd/v
      - ./v -o vc/v_win.c -os windows -cc msvc cmd/v
      - sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v.c
      - sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v_win.c
      # ensure the C files are over 5000 lines long, as a safety measure
      - '[ $(wc -l < vc/v.c) -gt 5000 ]'
      - '[ $(wc -l < vc/v_win.c) -gt 5000 ]'
      - git -C vc add v.c v_win.c
      - 'git -C vc commit -m "[v:master] $COMMIT_HASH - $COMMIT_MSG"'
      # in case there are recent commits:
      - git -C vc pull --rebase origin main
      - git -C vc push
    when:
      event: push

  publish:
    image: woodpeckerci/plugin-docker-buildx
    secrets: [ docker_username, docker_password ]
    settings:
      repo: chewingbever/vlang
      tag: latest
      dockerfile: Dockerfile.builder
      platforms: [ linux/arm64/v8, linux/amd64 ]
    # The build can run every time, because we should only push when there's
    # actual changes
    when:
      event: push
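The `sed -i "1s/^/.../"` commands in the pipeline above stamp the commit hash into the generated C files by substituting at the start of line 1, which effectively prepends a new first line. A minimal sketch with a made-up hash and a throwaway file:

```shell
# '1s/^/.../' matches the (empty) beginning of line 1 and inserts text
# plus a newline there, prepending a line to the file.
COMMIT_HASH=abc1234  # made-up value; the pipeline uses `git rev-parse --short HEAD`
printf 'int main() { return 0; }\n' > /tmp/v_stamp_demo.c
sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" /tmp/v_stamp_demo.c
head -1 /tmp/v_stamp_demo.c
# → #define V_COMMIT_HASH "abc1234"
```

The `\n` in the replacement is a GNU sed feature, which is fine here since the step runs in an `ubuntu:latest` container.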
@@ -0,0 +1,32 @@
matrix:
  PLATFORM:
    - 'linux/amd64'
    - 'linux/arm64'

platform: ${PLATFORM}
branches: ['master']
depends_on:
  - 'vc'

pipeline:
  build:
    image: 'menci/archlinuxarm:base-devel'
    commands:
      # Update packages
      - pacman -Syu --noconfirm
      # Create non-root user to perform build & switch to their home
      - groupadd -g 1000 builder
      - useradd -mg builder builder
      - chown -R builder:builder "$PWD"
      - "echo 'builder ALL=(ALL) NOPASSWD: ALL' >> /etc/sudoers"
      - su builder
      # Build the package
      - makepkg -s --noconfirm --needed

  publish:
    image: 'curlimages/curl'
    secrets:
      - 'vieter_api_key'
    commands:
      # Publish the package
      - 'for pkg in $(ls -1 *.pkg*); do curl -f -XPOST -T "$pkg" -H "X-API-KEY: $VIETER_API_KEY" https://arch.r8r.be/vieter/publish; done'
@@ -0,0 +1,18 @@
platform: 'linux/amd64'
branches: ['master']
depends_on:
  - 'vc'

pipeline:
  build-publish:
    image: 'woodpeckerci/plugin-docker-buildx'
    secrets: [ docker_username, docker_password ]
    settings:
      repo: chewingbever/vlang
      tag: latest
      dockerfile: Dockerfile.builder
      platforms: [ linux/arm64/v8, linux/amd64 ]
    # The build can run every time, because we should only push when there's
    # actual changes
    when:
      event: push
@@ -0,0 +1,48 @@
platform: 'linux/amd64'
branches: ['master']

pipeline:
  gen-vc:
    # This is what the official CI uses as well
    image: 'ubuntu:latest'
    secrets:
      - deploy_key
    commands:
      # Install necessary dependencies
      - apt-get update -y && apt-get install openssh-client git build-essential -y
      # Build the compiler
      - make
      # Run ssh-agent
      - eval $(ssh-agent -s)
      # Add ssh key
      - echo "$DEPLOY_KEY" | tr -d '\r' | ssh-add -
      # Create ssh dir with proper permissions
      - mkdir -p ~/.ssh
      - chmod 700 ~/.ssh
      # Configure git credentials
      - git config --global user.email 'vbot@rustybever.be'
      - git config --global user.name 'vbot'
      # Verify SSH keys
      - ssh-keyscan git.rustybever.be > ~/.ssh/known_hosts

      # The following is copied over from the official repo's CI
      # https://github.com/vlang/v/blob/master/.github/workflows/gen_vc.yml
      - export "COMMIT_HASH=$(git rev-parse --short HEAD)"
      - export "COMMIT_MSG=$(git log -1 --oneline --pretty='%s' HEAD)"
      - rm -rf vc
      - git clone --depth=1 'git@git.rustybever.be:vieter-v/vc.git'
      - rm -rf vc/v.c vc/v_win.c
      - ./v -o vc/v.c -os cross cmd/v
      - ./v -o vc/v_win.c -os windows -cc msvc cmd/v
      - sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v.c
      - sed -i "1s/^/#define V_COMMIT_HASH \"$COMMIT_HASH\"\n/" vc/v_win.c
      # ensure the C files are over 5000 lines long, as a safety measure
      - '[ $(wc -l < vc/v.c) -gt 5000 ]'
      - '[ $(wc -l < vc/v_win.c) -gt 5000 ]'
      - git -C vc add v.c v_win.c
      - 'git -C vc commit -m "[v:master] $COMMIT_HASH - $COMMIT_MSG"'
      # in case there are recent commits:
      - git -C vc pull --rebase origin main
      - git -C vc push
    when:
      event: push
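The `wc -l` assertions in the pipeline above act as a sanity gate: a failed or truncated code-generation run would leave a short `v.c`, the `[ ... -gt 5000 ]` test would exit non-zero, and the pipeline would abort before anything is committed. A standalone sketch of the same gate, using a throwaway file:

```shell
# Generate a throwaway 6001-line file, then apply the same line-count gate
# the pipeline uses before committing the generated C sources.
seq 6001 > /tmp/vc_gate_demo.c
[ "$(wc -l < /tmp/vc_gate_demo.c)" -gt 5000 ] && echo 'v.c looks sane'
# → v.c looks sane
```

With fewer than 5001 lines, the `[ ... ]` test fails and, in a CI command list, stops the step.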
@@ -2,6 +2,7 @@
-*Not yet released, changelog is not full*
- Introduce `isize` and `usize` types, deprecate `size_t` in favor of `usize`.
- Add `datatypes` and `datatypes.fsm` modules.
- Add `compile_error` and `compile_warn` comptime functions.

-## V 0.2.4
-*Not yet released, changelog is not full*
@@ -35,32 +35,35 @@ The main files are:
- Creates a parser object for each file and runs `parse()` on them.
- The correct backend is called (C, JS, native), and a binary is compiled.

-2. `v/scanner` The scanner's job is to parse a list of characters and convert
+2. `vlib/v/scanner` The scanner's job is to parse a list of characters and convert
them to tokens.

-3. `v/token` This is simply a list of all tokens, their string values, and a
+3. `vlib/v/token` This is simply a list of all tokens, their string values, and a
couple of helper functions.

-4. `v/parser` The parser. It converts a list of tokens into an AST.
+4. `vlib/v/parser` The parser. It converts a list of tokens into an AST.
In V, objects can be used before declaration, so unknown types are marked as
unresolved. They are resolved later in the type checker.

-5. `v/table` V creates one table object that is shared by all parsers. It
+5. `vlib/v/table` V creates one table object that is shared by all parsers. It
contains all types, consts, and functions, as well as several helpers to search
for objects by name, register new objects, modify types' fields, etc.

-6. `v/checker` Type checker and resolver. It processes the AST and makes sure
+6. `vlib/v/checker` Type checker and resolver. It processes the AST and makes sure
the types are correct. Unresolved types are resolved, type information is added
to the AST.

-7. `v/gen/c` C backend. It simply walks the AST and generates C code that can be
+7. `vlib/v/gen/c` C backend. It simply walks the AST and generates C code that can be
compiled with Clang, GCC, Visual Studio, and TCC.

+8. `vlib/v/gen/js` JavaScript backend. It simply walks the AST and generates JS code that can be
+executed in the browser or in NodeJS/Deno.

-8. `json.v` defines the json code generation. This file will be removed once V
+9. `vlib/v/gen/c/json.v` defines the json code generation. This file will be removed once V
supports comptime code generation, and it will be possible to do this using the
language's tools.

-9. `v/gen/native` is the directory with all the machine code generation logic. It
+10. `vlib/v/gen/native` is the directory with all the machine code generation logic. It
defines a set of functions that translate assembly instructions to machine code
and build the binary from scratch byte by byte. It manually builds all headers,
segments, sections, symtable, relocations, etc. Right now it only has basic
|
|||
|
||||
5. When finished with a feature/bugfix/change, you can:
|
||||
`git checkout -b fix_alabala`
|
||||
- Don't forget to keep formatting standards, run `v fmt -w YOUR_MODIFIED_FILES` before committing
|
||||
6. `git push pullrequest` # (NOTE: the `pullrequest` remote was setup on step 4)
|
||||
7. On GitHub's web interface, go to: https://github.com/vlang/v/pulls
|
||||
|
||||
|
@ -187,7 +191,6 @@ to create a copy of the compiler rather than replacing it with `v self`.
|
|||
| `debug_codegen` | Prints automatically generated V code during the scanning phase |
|
||||
| `debug_interface_table` | Prints generated interfaces during C generation |
|
||||
| `debug_interface_type_implements` | Prints debug information when checking that a type implements in interface |
|
||||
| `debug_embed_file_in_prod` | Prints debug information about the embedded files with `$embed_file('somefile')` |
|
||||
| `print_vweb_template_expansions` | Prints vweb compiled HTML files |
|
||||
| `time_checking` | Prints the time spent checking files and other related information |
|
||||
| `time_parsing` | Prints the time spent parsing files and other related information |
|
||||
|
@ -200,3 +203,4 @@ to create a copy of the compiler rather than replacing it with `v self`.
|
|||
| `trace_thirdparty_obj_files` | Prints details about built thirdparty obj files |
|
||||
| `trace_usecache` | Prints details when -usecache is used |
|
||||
| `trace_embed_file` | Prints details when $embed_file is used |
|
||||
| `embed_only_metadata` | Embed only the metadata for the file(s) with `$embed_file('somefile')`; faster; for development, *not* distribution |
|
||||
|
|
|
@@ -0,0 +1,33 @@
FROM alpine:3.16

ARG TARGETPLATFORM

WORKDIR /opt/vlang

ENV VVV /opt/vlang
ENV PATH /opt/vlang:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ENV VFLAGS -cc gcc -gc none
ENV V_PATH /opt/vlang/v

RUN ln -s /opt/vlang/v /usr/bin/v && \
    apk --no-cache add \
    git make gcc curl openssl \
    musl-dev \
    openssl-libs-static openssl-dev \
    zlib-static bzip2-static xz-dev expat-static zstd-static lz4-static \
    sqlite-static sqlite-dev \
    libx11-dev glfw-dev freetype-dev \
    libarchive-static libarchive-dev \
    diffutils \
    mandoc

RUN git clone https://git.rustybever.be/vieter/v /opt/vlang && \
    make && \
    v -version

RUN if [ "$TARGETPLATFORM" = 'linux/amd64' ]; then \
    wget -O /usr/local/bin/mc https://dl.min.io/client/mc/release/linux-amd64/mc && \
    chmod +x /usr/local/bin/mc ; \
    fi

CMD ["v"]

@@ -1,10 +1,10 @@
FROM mstorsjo/llvm-mingw

-LABEL maintainer="Vitaly Takmazov <vitalyster@gmail.com>"
+LABEL maintainer="Delyan Angelov <delian66@gmail.com>"
COPY . .
RUN make
RUN ./v -os windows -o v.c cmd/v
-RUN x86_64-w64-mingw32-gcc v.c -std=c99 -I ./thirdparty/stdatomic/win -w -municode -o v.exe
+RUN x86_64-w64-mingw32-gcc v.c -std=c99 -w -municode -o v.exe
RUN file v.exe

CMD [ "bash" ]
GNUmakefile

@@ -5,7 +5,7 @@ TMPDIR ?= /tmp
VROOT ?= .
VC ?= ./vc
V ?= ./v
-VCREPO ?= https://github.com/vlang/vc
+VCREPO ?= https://git.rustybever.be/vieter/vc
TCCREPO ?= https://github.com/vlang/tccbin

VCFILE := v.c

@@ -28,6 +28,9 @@ endif
ifeq ($(_SYS),Linux)
LINUX := 1
TCCOS := linux
+ifneq ($(shell ldd /bin/ls | grep musl),)
+TCCOS := linuxmusl
+endif
endif

ifeq ($(_SYS),Darwin)

@@ -76,7 +79,7 @@ endif
endif
endif

-.PHONY: all clean fresh_vc fresh_tcc check_for_working_tcc
+.PHONY: all clean check fresh_vc fresh_tcc check_for_working_tcc

ifdef prod
VFLAGS+=-prod

@@ -84,13 +87,13 @@ endif

all: latest_vc latest_tcc
ifdef WIN32
-	$(CC) $(CFLAGS) -std=c99 -municode -w -I ./thirdparty/stdatomic/nix -o v1.exe $(VC)/$(VCFILE) $(LDFLAGS)
+	$(CC) $(CFLAGS) -std=c99 -municode -w -o v1.exe $(VC)/$(VCFILE) $(LDFLAGS)
	v1.exe -no-parallel -o v2.exe $(VFLAGS) cmd/v
	v2.exe -o $(V) $(VFLAGS) cmd/v
	del v1.exe
	del v2.exe
else
-	$(CC) $(CFLAGS) -std=gnu99 -w -I ./thirdparty/stdatomic/nix -o v1.exe $(VC)/$(VCFILE) -lm -lpthread $(LDFLAGS)
+	$(CC) $(CFLAGS) -std=gnu99 -w -o v1.exe $(VC)/$(VCFILE) -lm -lpthread $(LDFLAGS)
	./v1.exe -no-parallel -o v2.exe $(VFLAGS) cmd/v
	./v2.exe -o $(V) $(VFLAGS) cmd/v
	rm -rf v1.exe v2.exe

@@ -113,7 +116,7 @@ endif

check_for_working_tcc:
	@$(TMPTCC)/tcc.exe --version > /dev/null 2> /dev/null || echo "The executable '$(TMPTCC)/tcc.exe' does not work."

fresh_vc:
	rm -rf $(VC)
	$(GITFASTCLONE) $(VCREPO) $(VC)

@@ -164,3 +167,5 @@ selfcompile-static:
install:
	@echo 'Please use `sudo ./v symlink` instead.'

+check:
+	$(V) test-all
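The `ifneq` block added above can be exercised directly in a shell: the GNUmakefile probes whether `/bin/ls` is dynamically linked against musl to decide which tccbin flavour (`linux` vs `linuxmusl`) to fetch:

```shell
# Same libc probe the GNUmakefile uses: on a musl-based system (e.g. Alpine),
# the ldd output mentions 'musl'; on a glibc system it does not.
if ldd /bin/ls 2>/dev/null | grep -q musl; then
    TCCOS=linuxmusl
else
    TCCOS=linux
fi
echo "$TCCOS"
```

The printed value depends on the host libc, which is exactly why the Makefile computes it at build time instead of hardcoding it.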
Makefile

@@ -3,12 +3,17 @@ VFLAGS ?=
CFLAGS ?=
LDFLAGS ?=

+.PHONY: all check

all:
	rm -rf vc/
-	git clone --depth 1 --quiet https://github.com/vlang/vc
-	$(CC) $(CFLAGS) -std=gnu11 -w -I ./thirdparty/stdatomic/nix -o v1 vc/v.c -lm -lexecinfo -lpthread $(LDFLAGS)
+	git clone --depth 1 --quiet https://git.rustybever.be/vieter/vc
+	$(CC) $(CFLAGS) -std=gnu11 -w -o v1 vc/v.c -lm -lexecinfo -lpthread $(LDFLAGS)
	./v1 -no-parallel -o v2 $(VFLAGS) cmd/v
	./v2 -o v $(VFLAGS) cmd/v
	rm -rf v1 v2 vc/
	@echo "V has been successfully built"
	./v run ./cmd/tools/detect_tcc.v

+check:
+	./v test-all
@@ -0,0 +1,54 @@
# Maintainer: Jef Roosens

# This PKGBUILD is mostly copied over from the AUR
# https://aur.archlinux.org/packages/vlang-git

pkgname=vieter-v
pkgver=0.2.2.r796.gfbc02cbc5
pkgrel=1
pkgdesc='Simple, fast, safe, compiled language for developing maintainable software'
arch=('x86_64' 'aarch64')
url='https://vlang.io'
license=('MIT')
depends=('glibc')
makedepends=('git')
optdepends=('glfw: Needed for graphics support'
            'freetype2: Needed for graphics support'
            'openssl: Needed for http support')
provides=('vlang')
conflicts=('v' 'vlang' 'vlang-bin')
source=('vlang::git+https://git.rustybever.be/Chewing_Bever/v')
sha256sums=('SKIP')

pkgver() {
    cd "${srcdir}/vlang"
    # Weekly tags are considered older than semantic tags that are older than
    # them, so to prevent version resolution problems we exclude weekly tags.
    git describe --long --tags --exclude "weekly*" | sed 's/^v//;s/\([^-]*-g\)/r\1/;s/-/./g'
}
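The `git describe | sed` pipeline in `pkgver()` turns a tag-relative description into a pacman-friendly version string. Its effect can be traced on a sample describe string (reconstructed here from the `pkgver` value at the top of the file):

```shell
# 'v0.2.2-796-gfbc02cbc5' means: tag v0.2.2, 796 commits later, at gfbc02cbc5.
describe='v0.2.2-796-gfbc02cbc5'
# strip the leading 'v', prefix the commit-count..-g segment with 'r',
# then turn every remaining dash into a dot
echo "$describe" | sed 's/^v//;s/\([^-]*-g\)/r\1/;s/-/./g'
# → 0.2.2.r796.gfbc02cbc5
```

Dashes must go because pacman reserves `-` for the pkgrel separator; the `r` prefix keeps the commit count sorting as a revision component.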
|
||||
build() {
|
||||
cd "${srcdir}/vlang"
|
||||
# We don't require optimizations when compiling the bootstrap executable and
|
||||
# -O2 actually breaks `./v self` (resulting in "cgen error:"), so we empty
|
||||
# CFLAGS and LDFLAGS to ensure successful compilation.
|
||||
CFLAGS="" LDFLAGS="" prod=1 make
|
||||
|
||||
# vpm and vdoc fail to compile with "unsupported linker option" when LDFLAGS
|
||||
# is set
|
||||
LDFLAGS="" ./v build-tools
|
||||
}
|
||||
|
||||
package() {
|
||||
cd "${srcdir}/vlang"
|
||||
install -d "$pkgdir/usr/lib/vlang" "$pkgdir/usr/share/vlang" "$pkgdir/usr/bin"
|
||||
install -Dm644 LICENSE "$pkgdir/usr/share/licenses/$pkgname/LICENSE"
|
||||
install -Dm755 v "$pkgdir/usr/lib/vlang"
|
||||
cp -a cmd "$pkgdir/usr/lib/vlang/"
|
||||
cp -a examples "$pkgdir/usr/share/vlang/"
|
||||
cp -a thirdparty "$pkgdir/usr/lib/vlang/"
|
||||
cp -a vlib "$pkgdir/usr/lib/vlang/"
|
||||
cp v.mod "$pkgdir/usr/lib/vlang/"
|
||||
ln -s /usr/lib/vlang/v "$pkgdir/usr/bin/v"
|
||||
|
||||
touch "$pkgdir/usr/lib/vlang/cmd/tools/.disable_autorecompilation"
|
||||
}
|
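The `pkgver()` function above turns the output of `git describe --long --tags` into a pacman-friendly version string. As a standalone sketch of what the sed expression does (the sample describe string below is illustrative, matching the `pkgver=` value in the PKGBUILD):

```shell
# Feed a sample `git describe --long --tags` output through the same sed
# script used by pkgver() above. The input string is an illustration only.
describe='v0.2.2-796-gfbc02cbc5'
pkgver=$(printf '%s\n' "$describe" | sed 's/^v//;s/\([^-]*-g\)/r\1/;s/-/./g')
# strips the leading v, prefixes the commit count with r, turns dashes into dots
echo "$pkgver"   # 0.2.2.r796.gfbc02cbc5
```

This yields monotonically increasing versions as commits accumulate on top of a tag, which is the usual convention for VCS packages.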
README.md
@@ -4,11 +4,7 @@
 </p>
 <h1>The V Programming Language</h1>

-[vlang.io](https://vlang.io) |
-[Docs](https://github.com/vlang/v/blob/master/doc/docs.md) |
-[Changelog](https://github.com/vlang/v/blob/master/CHANGELOG.md) |
-[Speed](https://fast.vlang.io/) |
-[Contributing & compiler design](https://github.com/vlang/v/blob/master/CONTRIBUTING.md)
+[vlang.io](https://vlang.io) | [Docs](https://github.com/vlang/v/blob/master/doc/docs.md) | [Changelog](https://github.com/vlang/v/blob/master/CHANGELOG.md) | [Speed](https://fast.vlang.io/) | [Contributing & compiler design](https://github.com/vlang/v/blob/master/CONTRIBUTING.md)

 </div>
 <div align="center">

@@ -31,7 +27,7 @@
 - Easy to develop: V compiles itself in less than a second
 - Performance: as fast as C (V's main backend compiles to human-readable C)
 - Safety: no null, no globals, no undefined behavior, immutability by default
-- C to V translation
+- C to V translation ([Translating DOOM demo video](https://www.youtube.com/watch?v=6oXrz3oRoEg))
 - Hot code reloading
 - [Innovative memory management](https://vlang.io/#memory) ([demo video](https://www.youtube.com/watch?v=gmB8ea8uLsM))
 - [Cross-platform UI library](https://github.com/vlang/ui)

@@ -64,11 +60,11 @@ language, very similar to the way it is right now.

 ### Linux, macOS, Windows, *BSD, Solaris, WSL, Android, etc.

-Usually installing V is quite simple if you have an environment that already has a
-functional `git` installation.
+Usually installing V is quite simple if you have an environment that already has a
+functional `git` installation.

 * *(* ***PLEASE NOTE:*** *If you run into any trouble or you have a different operating
-system or Linux distribution that doesn't install or work immediately, please see
+system or Linux distribution that doesn't install or work immediately, please see
 [Installation Issues](https://github.com/vlang/v/discussions/categories/installation-issues)
 and search for your OS and problem. If you can't find your problem, please add it to an
 existing discussion if one exists for your OS, or create a new one if a main discussion

TESTS.md
@@ -83,6 +83,20 @@ This *test runner* checks whether whole project folders can be compiled and run
 NB: Each project in these folders should finish with an exit code of 0,
 and it should output `OK` as its last stdout line.

+## `v vlib/v/tests/known_errors/known_errors_test.v`
+This *test runner* checks that known programs, which were expected to compile
+but did NOT (due to a buggy checker, parser or cgen), continue to fail.
+The negative programs are collected in the `vlib/v/tests/known_errors/testdata/` folder.
+Each of them should FAIL to compile, due to a known/confirmed compiler bug/limitation.
+
+The intended use of this is to provide samples that currently do NOT compile,
+but that a future compiler improvement WILL be able to compile, and to
+track whether they get fixed incidentally, due to an unrelated
+change/improvement. For example, code that triggers generating invalid C code can go here,
+and later, when the bug is fixed, can be moved to a proper _test.v or .vv/.out pair, outside of
+the `vlib/v/tests/known_errors/testdata/` folder.
+
 ## Test building of actual V programs (examples, tools, V itself)

 * `v build-tools`

@@ -14,9 +14,9 @@ fn main() {
 	mut checksum := u64(0)
 	mut start_pos := 0
 	mut bgenerating := benchmark.start()
-	mut bytepile := []byte{}
+	mut bytepile := []u8{}
 	for _ in 0 .. sample_size * max_str_len {
-		bytepile << byte(rand.int_in_range(40, 125) or { 40 })
+		bytepile << u8(rand.int_in_range(40, 125) or { 40 })
 	}
 	mut str_lens := []int{}
 	for _ in 0 .. sample_size {

@@ -30,7 +30,7 @@ fn main() {
 	checksum = 0
 	for len in str_lens {
 		end_pos := start_pos + len
-		checksum ^= wyhash.wyhash_c(unsafe { &byte(bytepile.data) + start_pos }, u64(len),
+		checksum ^= wyhash.wyhash_c(unsafe { &u8(bytepile.data) + start_pos }, u64(len),
 			1)
 		start_pos = end_pos
 	}

@@ -43,7 +43,7 @@ fn main() {
 	vexe := pref.vexe_path()
 	vroot := os.dir(vexe)
 	util.set_vroot_folder(vroot)
-	os.chdir(vroot) ?
+	os.chdir(vroot)?
 	cmd := diff.find_working_diff_command() or { '' }
 	mut app := App{
 		diff_cmd: cmd

@@ -4,7 +4,8 @@ fn main() {
 		exit(0)
 	}

-	println('
+	$if !macos {
+		println('
 Note: `tcc` was not used, so unless you install it yourself, your backend
 C compiler will be `cc`, which is usually either `clang`, `gcc` or `msvc`.

@@ -12,4 +13,5 @@ These C compilers, are several times slower at compiling C source code,
 compared to `tcc`. They do produce more optimised executables, but that
 is done at the cost of compilation speed.
 ')
+	}
 }

@@ -15,7 +15,7 @@ const vdir = @VEXEROOT
 fn main() {
 	dump(fast_dir)
 	dump(vdir)
-	os.chdir(fast_dir) ?
+	os.chdir(fast_dir)?
 	if !os.exists('$vdir/v') && !os.is_dir('$vdir/vlib') {
 		println('fast.html generator needs to be located in `v/cmd/tools/fast`')
 	}

@@ -28,37 +28,37 @@ fn main() {
 			return
 		}
 	}
-	// Fetch the last commit's hash
+
+	// fetch the last commit's hash
 	commit := exec('git rev-parse HEAD')[..8]
 	if !os.exists('table.html') {
-		os.create('table.html') ?
+		os.create('table.html')?
 	}
-	mut table := os.read_file('table.html') ?
+	mut table := os.read_file('table.html')?
 	if os.exists('website/index.html') {
-		uploaded_index := os.read_file('website/index.html') ?
+		uploaded_index := os.read_file('website/index.html')?
 		if uploaded_index.contains('>$commit<') {
 			println('nothing to benchmark')
-			exit(1)
+			return
 		}
 	}
 	// for i, commit in commits {
 	message := exec('git log --pretty=format:"%s" -n1 $commit')
 	// println('\n${i + 1}/$commits.len Benchmarking commit $commit "$message"')
 	println('\nBenchmarking commit $commit "$message"')
-	// Build an optimized V
 	// println('Checking out ${commit}...')
 	// exec('git checkout $commit')
+
+	// build an optimized V
 	println(' Building vprod...')
-	os.chdir(vdir) ?
+	os.chdir(vdir)?
 	if os.args.contains('-noprod') {
 		exec('./v -o vprod cmd/v') // for faster debugging
 	} else {
 		exec('./v -o vprod -prod -prealloc cmd/v')
 	}
+
 	// cache vlib modules
 	exec('$vdir/v wipe-cache')
 	exec('$vdir/v -o v2 -prod cmd/v')
+
 	// measure
 	diff1 := measure('$vdir/vprod $voptions -o v.c cmd/v', 'v.c')
 	mut tcc_path := 'tcc'

@@ -71,23 +71,24 @@ fn main() {
 	if os.args.contains('-clang') {
 		tcc_path = 'clang'
 	}
+
 	diff2 := measure('$vdir/vprod $voptions -cc $tcc_path -o v2 cmd/v', 'v2')
 	diff3 := 0 // measure('$vdir/vprod -native $vdir/cmd/tools/1mil.v', 'native 1mil')
 	diff4 := measure('$vdir/vprod -usecache $voptions -cc clang examples/hello_world.v',
 		'hello.v')
 	vc_size := os.file_size('v.c') / 1000
 	// scan/parse/check/cgen
 	scan, parse, check, cgen, vlines := measure_steps(vdir)
 	// println('Building V took ${diff}ms')
+
 	commit_date := exec('git log -n1 --pretty="format:%at" $commit')
 	date := time.unix(commit_date.int())
-	//
-	os.chdir(fast_dir) ?
-	mut out := os.create('table.html') ?
-	// Place the new row on top
+
+	os.chdir(fast_dir)?
+	mut out := os.create('table.html')?
+
+	// place the new row on top
 	html_message := message.replace_each(['<', '&lt;', '>', '&gt;'])
 	table =
-		'<tr>
+		' <tr>
 		<td>$date.format()</td>
 		<td><a target=_blank href="https://github.com/vlang/v/commit/$commit">$commit</a></td>
 		<td>$html_message</td>

@@ -104,26 +105,25 @@ fn main() {
 		<td>${int(f64(vlines) / f64(diff1) * 1000.0)}</td>
 	</tr>\n' +
 		table.trim_space()
-	out.writeln(table) ?
+	out.writeln(table)?
 	out.close()
-	// Regenerate index.html
-	header := os.read_file('header.html') ?
-	footer := os.read_file('footer.html') ?
-	mut res := os.create('index.html') ?
-	res.writeln(header) ?
-	res.writeln(table) ?
-	res.writeln(footer) ?
+
+	// regenerate index.html
+	header := os.read_file('header.html')?
+	footer := os.read_file('footer.html')?
+	mut res := os.create('index.html')?
+	res.writeln(header)?
+	res.writeln(table)?
+	res.writeln(footer)?
 	res.close()
 	//}
 	// exec('git checkout master')
 	// os.write_file('last_commit.txt', commits[commits.len - 1]) ?
-	// Upload the result to github pages
+
+	// upload the result to github pages
 	if os.args.contains('-upload') {
 		println('uploading...')
-		os.chdir('website') ?
+		os.chdir('website')?
 		os.execute_or_exit('git checkout gh-pages')
-		os.cp('../index.html', 'index.html') ?
-		os.rm('../index.html') ?
+		os.cp('../index.html', 'index.html')?
+		os.rm('../index.html')?
 		os.system('git commit -am "update benchmark"')
 		os.system('git push origin gh-pages')
 	}

@@ -134,7 +134,7 @@ fn exec(s string) string {
 	return e.output.trim_right('\r\n')
 }

-// returns milliseconds
+// measure returns milliseconds
 fn measure(cmd string, description string) int {
 	println('  Measuring $description')
 	println('  Warming up...')

@@ -186,7 +186,7 @@ fn measure_steps(vdir string) (int, int, int, int, int) {
 			cgen = line[0].int()
 		}
 	} else {
-		// Fetch number of V lines
+		// fetch number of V lines
 		if line[0].contains('V') && line[0].contains('source') && line[0].contains('size') {
 			start := line[0].index(':') or { 0 }
 			end := line[0].index('lines,') or { 0 }

@@ -0,0 +1,67 @@
import os
import time
import v.ast
import v.pref
import v.parser
import v.errors
import v.scanner

fn main() {
	files := os.args#[1..]
	if files.len > 0 && files[0].starts_with('@') {
		lst_path := files[0].all_after('@')
		listed_files := os.read_file(lst_path)?.split('\n')
		process_files(listed_files)?
		return
	}
	process_files(files)?
}

fn process_files(files []string) ? {
	mut table := ast.new_table()
	mut pref := pref.new_preferences()
	pref.is_fmt = true
	pref.skip_warnings = true
	pref.output_mode = .silent
	mut sw := time.new_stopwatch()
	mut total_us := i64(0)
	mut total_bytes := i64(0)
	mut total_tokens := i64(0)
	for f in files {
		if f == '' {
			continue
		}
		if f.ends_with('_test.v') {
			continue
		}
		// do not measure the scanning, but only the parsing:
		mut p := new_parser(f, .skip_comments, table, pref)
		///
		sw.restart()
		_ := p.parse()
		f_us := sw.elapsed().microseconds()
		///
		total_us += f_us
		total_bytes += p.scanner.text.len
		total_tokens += p.scanner.all_tokens.len
		println('${f_us:10}us ${p.scanner.all_tokens.len:10} ${p.scanner.text.len:10} ${(f64(p.scanner.text.len) / p.scanner.all_tokens.len):7.3} ${p.errors.len:4} $f')
	}
	println('${total_us:10}us ${total_tokens:10} ${total_bytes:10} ${(f64(total_tokens) / total_bytes):7.3} | speed: ${(f64(total_bytes) / total_us):2.5f} MB/s')
}

fn new_parser(path string, comments_mode scanner.CommentsMode, table &ast.Table, pref &pref.Preferences) &parser.Parser {
	mut p := &parser.Parser{
		scanner: scanner.new_scanner_file(path, comments_mode, pref) or { panic(err) }
		comments_mode: comments_mode
		table: table
		pref: pref
		scope: &ast.Scope{
			start_pos: 0
			parent: table.global_scope
		}
		errors: []errors.Error{}
		warnings: []errors.Warning{}
	}
	p.set_path(path)
	return p
}

@@ -0,0 +1,42 @@
import os
import time
import v.scanner
import v.pref

fn main() {
	files := os.args#[1..]
	if files.len > 0 && files[0].starts_with('@') {
		lst_path := files[0].all_after('@')
		listed_files := os.read_file(lst_path)?.split('\n')
		process_files(listed_files)?
		return
	}
	process_files(files)?
}

fn process_files(files []string) ? {
	mut pref := pref.new_preferences()
	pref.is_fmt = true
	pref.skip_warnings = true
	pref.output_mode = .silent
	mut sw := time.new_stopwatch()
	mut total_us := i64(0)
	mut total_bytes := i64(0)
	mut total_tokens := i64(0)
	for f in files {
		if f == '' {
			continue
		}
		if f.ends_with('_test.v') {
			continue
		}
		sw.restart()
		s := scanner.new_scanner_file(f, .skip_comments, pref)?
		f_us := sw.elapsed().microseconds()
		total_us += f_us
		total_bytes += s.text.len
		total_tokens += s.all_tokens.len
		println('${f_us:10}us ${s.all_tokens.len:10} ${s.text.len:10} ${(f64(s.text.len) / s.all_tokens.len):7.3f} $f')
	}
	println('${total_us:10}us ${total_tokens:10} ${total_bytes:10} ${(f64(total_tokens) / total_bytes):7.3f} | speed: ${(f64(total_bytes) / total_us):2.5f} MB/s')
}

@@ -1,172 +0,0 @@
// Copyright (c) 2020 Lars Pontoppidan. All rights reserved.
// Use of this source code is governed by an MIT license
// that can be found in the LICENSE file.
import os
import flag

const (
	tool_name        = os.file_name(os.executable())
	tool_version     = '0.0.3'
	tool_description = 'Prints all V functions in .v files under PATH/, that do not yet have documentation comments.'
	work_dir_prefix  = normalise_path(os.real_path(os.wd_at_startup) + '/')
)

struct UndocumentedFN {
	line      int
	signature string
	tags      []string
}

struct Options {
	show_help       bool
	collect_tags    bool
	deprecated      bool
	private         bool
	js              bool
	no_line_numbers bool
	exclude         []string
	relative_paths  bool
}

fn (opt Options) report_undocumented_functions_in_path(path string) {
	mut files := []string{}
	collect(path, mut files, fn (npath string, mut accumulated_paths []string) {
		if !npath.ends_with('.v') {
			return
		}
		if npath.ends_with('_test.v') {
			return
		}
		accumulated_paths << npath
	})
	for file in files {
		if !opt.js && file.ends_with('.js.v') {
			continue
		}
		if opt.exclude.len > 0 && opt.exclude.any(file.contains(it)) {
			continue
		}
		opt.report_undocumented_functions_in_file(file)
	}
}

fn (opt &Options) report_undocumented_functions_in_file(nfile string) {
	file := os.real_path(nfile)
	contents := os.read_file(file) or { panic(err) }
	lines := contents.split('\n')
	mut info := []UndocumentedFN{}
	for i, line in lines {
		if line.starts_with('pub fn') || (opt.private && (line.starts_with('fn ')
			&& !(line.starts_with('fn C.') || line.starts_with('fn main')))) {
			// println('Match: $line')
			if i > 0 && lines.len > 0 {
				mut line_above := lines[i - 1]
				if !line_above.starts_with('//') {
					mut tags := []string{}
					mut grab := true
					for j := i - 1; j >= 0; j-- {
						prev_line := lines[j]
						if prev_line.contains('}') { // We've looked back to the above scope, stop here
							break
						} else if prev_line.starts_with('[') {
							tags << collect_tags(prev_line)
							continue
						} else if prev_line.starts_with('//') { // Single-line comment
							grab = false
							break
						}
					}
					if grab {
						clean_line := line.all_before_last(' {')
						info << UndocumentedFN{i + 1, clean_line, tags}
					}
				}
			}
		}
	}
	if info.len > 0 {
		for undocumented_fn in info {
			mut line_numbers := '$undocumented_fn.line:0:'
			if opt.no_line_numbers {
				line_numbers = ''
			}
			tags_str := if opt.collect_tags && undocumented_fn.tags.len > 0 {
				'$undocumented_fn.tags'
			} else {
				''
			}
			ofile := if opt.relative_paths {
				nfile.replace(work_dir_prefix, '')
			} else {
				os.real_path(nfile)
			}
			if opt.deprecated {
				println('$ofile:$line_numbers$undocumented_fn.signature $tags_str')
			} else {
				if 'deprecated' !in undocumented_fn.tags {
					println('$ofile:$line_numbers$undocumented_fn.signature $tags_str')
				}
			}
		}
	}
}

fn normalise_path(path string) string {
	return path.replace('\\', '/')
}

fn collect(path string, mut l []string, f fn (string, mut []string)) {
	if !os.is_dir(path) {
		return
	}
	mut files := os.ls(path) or { return }
	for file in files {
		p := normalise_path(os.join_path_single(path, file))
		if os.is_dir(p) && !os.is_link(p) {
			collect(p, mut l, f)
		} else if os.exists(p) {
			f(p, mut l)
		}
	}
	return
}

fn collect_tags(line string) []string {
	mut cleaned := line.all_before('/')
	cleaned = cleaned.replace_each(['[', '', ']', '', ' ', ''])
	return cleaned.split(',')
}

fn main() {
	if os.args.len == 1 {
		println('Usage: $tool_name PATH \n$tool_description\n$tool_name -h for more help...')
		exit(1)
	}
	mut fp := flag.new_flag_parser(os.args[1..])
	fp.application(tool_name)
	fp.version(tool_version)
	fp.description(tool_description)
	fp.arguments_description('PATH [PATH]...')
	// Collect tool options
	opt := Options{
		show_help: fp.bool('help', `h`, false, 'Show this help text.')
		deprecated: fp.bool('deprecated', `d`, false, 'Include deprecated functions in output.')
		private: fp.bool('private', `p`, false, 'Include private functions in output.')
		js: fp.bool('js', 0, false, 'Include JavaScript functions in output.')
		no_line_numbers: fp.bool('no-line-numbers', `n`, false, 'Exclude line numbers in output.')
		collect_tags: fp.bool('tags', `t`, false, 'Also print function tags if any is found.')
		exclude: fp.string_multi('exclude', `e`, '')
		relative_paths: fp.bool('relative-paths', `r`, false, 'Use relative paths in output.')
	}
	if opt.show_help {
		println(fp.usage())
		exit(0)
	}
	for path in os.args[1..] {
		if os.is_file(path) {
			opt.report_undocumented_functions_in_file(path)
		} else {
			opt.report_undocumented_functions_in_path(path)
		}
	}
}

@@ -24,6 +24,7 @@ pub fn cprint(omessage string) {
 		message = term.cyan(message)
 	}
 	print(message)
+	flush_stdout()
 }

 pub fn cprint_strong(omessage string) {

@@ -32,16 +33,19 @@ pub fn cprint_strong(omessage string) {
 		message = term.bright_green(message)
 	}
 	print(message)
+	flush_stdout()
 }

 pub fn cprintln(omessage string) {
 	cprint(omessage)
 	println('')
+	flush_stdout()
 }

 pub fn cprintln_strong(omessage string) {
 	cprint_strong(omessage)
 	println('')
+	flush_stdout()
 }

 pub fn verbose_trace(label string, message string) {

@@ -19,6 +19,8 @@ pub const hide_oks = os.getenv('VTEST_HIDE_OK') == '1'

 pub const fail_fast = os.getenv('VTEST_FAIL_FAST') == '1'

+pub const fail_flaky = os.getenv('VTEST_FAIL_FLAKY') == '1'
+
 pub const test_only = os.getenv('VTEST_ONLY').split_any(',')

 pub const test_only_fn = os.getenv('VTEST_ONLY_FN').split_any(',')

@@ -35,7 +37,6 @@ pub mut:
 	vroot       string
 	vtmp_dir    string
 	vargs       string
-	failed      bool
 	fail_fast   bool
 	benchmark   benchmark.Benchmark
 	rm_binaries bool = true

@@ -122,6 +123,7 @@ pub fn (mut ts TestSession) print_messages() {
 		// progress mode, the last line is rewritten many times:
 		if is_ok && !ts.silent_mode {
 			print('\r$empty\r$msg')
+			flush_stdout()
 		} else {
 			// the last \n is needed, so SKIP/FAIL messages
 			// will not get overwritten by the OK ones

@@ -286,7 +288,7 @@ pub fn (mut ts TestSession) test() {
 fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 	mut ts := &TestSession(p.get_shared_context())
 	if ts.fail_fast {
-		if ts.failed {
+		if ts.failed_cmds.len > 0 {
 			return pool.no_result
 		}
 	}

@@ -306,12 +308,14 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 	mut run_js := false

 	is_fmt := ts.vargs.contains('fmt')
+	is_vet := ts.vargs.contains('vet')
+	produces_file_output := !(is_fmt || is_vet)

 	if relative_file.ends_with('js.v') {
-		if !is_fmt {
+		if produces_file_output {
 			cmd_options << ' -b js'
-			run_js = true
 		}
+		run_js = true
 	}

 	if relative_file.contains('global') && !is_fmt {

@@ -333,13 +337,13 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 		fname.replace('.v', '')
 	}
 	generated_binary_fpath := os.join_path_single(tmpd, generated_binary_fname)
-	if os.exists(generated_binary_fpath) {
-		if ts.rm_binaries {
-			os.rm(generated_binary_fpath) or {}
+	if produces_file_output {
+		if os.exists(generated_binary_fpath) {
+			if ts.rm_binaries {
+				os.rm(generated_binary_fpath) or {}
+			}
 		}
 	}

 	if !ts.vargs.contains('fmt') {
 		cmd_options << ' -o ${os.quoted_path(generated_binary_fpath)}'
 	}
 	cmd := '${os.quoted_path(ts.vexe)} ' + cmd_options.join(' ') + ' ${os.quoted_path(file)}'

@@ -360,7 +364,7 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 	details := get_test_details(file)
 	os.setenv('VTEST_RETRY_MAX', '$details.retry', true)
 	for retry := 1; retry <= details.retry; retry++ {
-		ts.append_message(.info, '  retrying $retry/$details.retry of $relative_file ...')
+		ts.append_message(.info, '  [stats]  retrying $retry/$details.retry of $relative_file ; known flaky: $details.flaky ...')
 		os.setenv('VTEST_RETRY', '$retry', true)
 		status = os.system(cmd)
 		if status == 0 {

@@ -370,7 +374,12 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 		}
 		time.sleep(500 * time.millisecond)
 	}
-	ts.failed = true
+	if details.flaky && !testing.fail_flaky {
+		ts.append_message(.info, ' *FAILURE* of the known flaky test file $relative_file is ignored, since VTEST_FAIL_FLAKY is 0 . Retry count: $details.retry .')
+		unsafe {
+			goto test_passed_system
+		}
+	}
 	ts.benchmark.fail()
 	tls_bench.fail()
 	ts.add_failed_cmd(cmd)

@@ -386,7 +395,6 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 	}
 	mut r := os.execute(cmd)
 	if r.exit_code < 0 {
-		ts.failed = true
 		ts.benchmark.fail()
 		tls_bench.fail()
 		ts.append_message(.fail, tls_bench.step_message_fail(normalised_relative_file))

@@ -397,7 +405,7 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 	details := get_test_details(file)
 	os.setenv('VTEST_RETRY_MAX', '$details.retry', true)
 	for retry := 1; retry <= details.retry; retry++ {
-		ts.append_message(.info, '  retrying $retry/$details.retry of $relative_file ...')
+		ts.append_message(.info, '  retrying $retry/$details.retry of $relative_file ; known flaky: $details.flaky ...')
 		os.setenv('VTEST_RETRY', '$retry', true)
 		r = os.execute(cmd)
 		if r.exit_code == 0 {

@@ -406,7 +414,12 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 			}
 		}
 	}
-	ts.failed = true
+	if details.flaky && !testing.fail_flaky {
+		ts.append_message(.info, ' *FAILURE* of the known flaky test file $relative_file is ignored, since VTEST_FAIL_FLAKY is 0 . Retry count: $details.retry .')
+		unsafe {
+			goto test_passed_execute
+		}
+	}
 	ts.benchmark.fail()
 	tls_bench.fail()
 	ending_newline := if r.output.ends_with('\n') { '\n' } else { '' }

@@ -421,10 +434,8 @@ fn worker_trunner(mut p pool.PoolProcessor, idx int, thread_id int) voidptr {
 			}
 		}
 	}
-	if os.exists(generated_binary_fpath) {
-		if ts.rm_binaries {
-			os.rm(generated_binary_fpath) or {}
-		}
+	if produces_file_output && os.exists(generated_binary_fpath) && ts.rm_binaries {
+		os.rm(generated_binary_fpath) or {}
 	}
 	return pool.no_result
 }

@@ -457,7 +468,7 @@ pub fn prepare_test_session(zargs string, folder string, oskipped []string, main
 		// for example module import tests, or subtests, that are compiled/run by other parent tests
 		// in specific configurations, etc.
 		if fnormalised.contains('testdata/') || fnormalised.contains('modules/')
-			|| f.contains('preludes/') {
+			|| fnormalised.contains('preludes/') {
 			continue
 		}
 		$if windows {

@@ -475,7 +486,8 @@ pub fn prepare_test_session(zargs string, folder string, oskipped []string, main
 			skipped << skipped_f
 		}
 		for skip_prefix in oskipped {
-			if f.starts_with(skip_prefix) {
+			skip_folder := skip_prefix + '/'
+			if fnormalised.starts_with(skip_folder) {
 				continue next_file
 			}
 		}

@@ -495,7 +507,7 @@ pub fn v_build_failing_skipped(zargs string, folder string, oskipped []string, c
 	cb(mut session)
 	session.test()
 	eprintln(session.benchmark.total_message(finish_label))
-	return session.failed
+	return session.failed_cmds.len > 0
 }

 pub fn build_v_cmd_failed(cmd string) bool {

@@ -549,6 +561,7 @@ pub fn eheader(msg string) {

 pub fn header(msg string) {
 	println(term.header_left(msg, '-'))
+	flush_stdout()
 }

 pub fn setup_new_vtmp_folder() string {

@@ -562,6 +575,7 @@ pub fn setup_new_vtmp_folder() string {
 pub struct TestDetails {
 pub mut:
 	retry int
+	flaky bool // when flaky tests fail, the whole run is still considered successful, unless VTEST_FAIL_FLAKY is 1
 }

 pub fn get_test_details(file string) TestDetails {

@@ -571,6 +585,9 @@ pub fn get_test_details(file string) TestDetails {
 		if line.starts_with('// vtest retry:') {
 			res.retry = line.all_after(':').trim_space().int()
 		}
+		if line.starts_with('// vtest flaky:') {
+			res.flaky = line.all_after(':').trim_space().bool()
+		}
 	}
 	return res
 }

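The `// vtest retry:` and `// vtest flaky:` magic comments added above are parsed by `get_test_details()` with a simple "everything after the colon, trimmed" rule. A rough shell equivalent of that parsing, using a made-up demo test file (file name and values are illustrative, not from the diff):

```shell
# Approximate, in shell, how get_test_details() reads the magic comments
# at the top of a _test.v file. The demo file below is hypothetical.
cat > /tmp/flaky_demo_test.v <<'EOF'
// vtest retry: 3
// vtest flaky: true
EOF
# take the text after the first ':' and strip whitespace
retry=$(grep '^// vtest retry:' /tmp/flaky_demo_test.v | cut -d: -f2 | tr -d '[:space:]')
flaky=$(grep '^// vtest flaky:' /tmp/flaky_demo_test.v | cut -d: -f2 | tr -d '[:space:]')
echo "retry=$retry flaky=$flaky"   # retry=3 flaky=true
```

With `flaky: true`, a failure is only counted against the run when the `VTEST_FAIL_FLAKY` environment variable is `1`, per the `fail_flaky` const introduced in this diff.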
@@ -125,7 +125,7 @@ fn main() {
 	should_sync := fp.bool('cache-sync', `s`, false, 'Update the local cache')
 	context.is_bisect = fp.bool('bisect', `b`, false, 'Bisect mode. Use the current commit in the repo where oldv is.')
 	if !should_sync && !context.is_bisect {
-		fp.limit_free_args(1, 1) ?
+		fp.limit_free_args(1, 1)?
 	}
 	////
 	context.cleanup = fp.bool('clean', 0, false, 'Clean before running (slower).')

@@ -194,7 +194,7 @@ fn main() {
 	fp.description(tool_description)
 	fp.arguments_description('COMMIT_BEFORE [COMMIT_AFTER]')
 	fp.skip_executable()
-	fp.limit_free_args(1, 2) ?
+	fp.limit_free_args(1, 2)?
 	context.vflags = fp.string('vflags', 0, '', 'Additional options to pass to the v commands, for example "-cc tcc"')
 	context.hyperfineopts = fp.string('hyperfine_options', 0, '', 'Additional options passed to hyperfine.
 ${flag.space}For example on linux, you may want to pass:

@@ -8,7 +8,7 @@ const oldvexe = fullpath(tools_folder, 'oldv')

 const oldv_source = fullpath(tools_folder, 'oldv.v')

-const vroot = os.real_path(os.dir(tools_folder))
+const vroot = os.real_path(os.dir(os.dir(tools_folder)))

 const vexe = fullpath(vroot, 'v')

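The `vroot` fix above wraps `tools_folder` in an extra `os.dir()` call, climbing two directory levels instead of one (presumably because `tools_folder` now points one level deeper in the tree). Since V's `os.dir()` behaves like `dirname`, the change can be sketched in shell (the sample path is illustrative only):

```shell
# os.dir() behaves like dirname; the fix climbs two levels instead of one.
# The sample path below is made up for illustration.
tools_folder='/some/vroot/cmd/tools'
old_vroot=$(dirname "$tools_folder")              # /some/vroot/cmd (one level)
new_vroot=$(dirname "$(dirname "$tools_folder")") # /some/vroot    (two levels)
echo "$new_vroot"   # /some/vroot
```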
@ -143,7 +143,7 @@ const (
|
|||
|
||||
fn main() {
|
||||
mut context := Context{}
|
||||
context.parse_options() ?
|
||||
context.parse_options()?
|
||||
context.run()
|
||||
context.show_diff_summary()
|
||||
}
|
||||
|
@ -155,7 +155,7 @@ fn (mut context Context) parse_options() ? {
|
|||
fp.description('Repeat command(s) and collect statistics. Note: you have to quote each command, if it contains spaces.')
|
||||
fp.arguments_description('CMD1 CMD2 ...')
|
||||
fp.skip_executable()
|
||||
fp.limit_free_args_to_at_least(1) ?
|
||||
fp.limit_free_args_to_at_least(1)?
|
||||
context.count = fp.int('count', `c`, 10, 'Repetition count.')
|
||||
context.series = fp.int('series', `s`, 2, 'Series count. `-s 2 -c 4 a b` => aaaabbbbaaaabbbb, while `-s 3 -c 2 a b` => aabbaabbaabb.')
|
||||
context.warmup = fp.int('warmup', `w`, 2, 'Warmup runs. These are done *only at the start*, and are ignored.')
|
||||
|
@ -200,8 +200,13 @@ fn (mut context Context) parse_options() ? {
|
|||
}
|
||||
}
|
||||
|
||||
fn flushed_print(s string) {
|
||||
print(s)
|
||||
flush_stdout()
|
||||
}
|
||||
|
||||
fn (mut context Context) clear_line() {
|
||||
print(context.cline)
|
||||
flushed_print(context.cline)
|
||||
}
|
||||
|
||||
fn (mut context Context) expand_all_commands(commands []string) []string {
|
||||
|
@ -247,7 +252,7 @@ fn (mut context Context) run() {
|
|||
println('Series: ${si:4}/${context.series:-4}, command: $cmd')
|
||||
if context.warmup > 0 && run_warmups < context.commands.len {
|
||||
for i in 1 .. context.warmup + 1 {
|
||||
print('${context.cgoback}warming up run: ${i:4}/${context.warmup:-4} for ${cmd:-50s} took ${duration:6} ms ...')
|
||||
flushed_print('${context.cgoback}warming up run: ${i:4}/${context.warmup:-4} for ${cmd:-50s} took ${duration:6} ms ...')
|
||||
mut sw := time.new_stopwatch()
|
||||
res := os.execute(cmd)
|
||||
if res.exit_code != 0 {
|
||||
|
@@ -260,9 +265,9 @@ fn (mut context Context) run() {
 context.clear_line()
 for i in 1 .. (context.count + 1) {
 avg := f64(sum) / f64(i)
-print('${context.cgoback}Average: ${avg:9.3f}ms | run: ${i:4}/${context.count:-4} | took ${duration:6} ms')
+flushed_print('${context.cgoback}Average: ${avg:9.3f}ms | run: ${i:4}/${context.count:-4} | took ${duration:6} ms')
 if context.show_output {
-print(' | result: ${oldres:s}')
+flushed_print(' | result: ${oldres:s}')
 }
 mut sw := time.new_stopwatch()
 res := scripting.exec(cmd) or { continue }
@@ -288,7 +293,7 @@ fn (mut context Context) run() {
 context.results[icmd].atiming = new_aints(context.results[icmd].timings, context.nmins,
 context.nmaxs)
 context.clear_line()
-print(context.cgoback)
+flushed_print(context.cgoback)
 mut m := map[string][]int{}
 ioutputs := context.results[icmd].outputs
 for o in ioutputs {
@@ -358,7 +363,7 @@ fn (mut context Context) show_diff_summary() {
 println('context: $context')
 }
 if int(base) > context.fail_on_maxtime {
-print(performance_regression_label)
+flushed_print(performance_regression_label)
 println('average time: ${base:6.1f} ms > $context.fail_on_maxtime ms threshold.')
 exit(2)
 }
@@ -367,7 +372,7 @@ fn (mut context Context) show_diff_summary() {
 }
 fail_threshold_max := f64(context.fail_on_regress_percent)
 if first_cmd_percentage > fail_threshold_max {
-print(performance_regression_label)
+flushed_print(performance_regression_label)
 println('${first_cmd_percentage:5.1f}% > ${fail_threshold_max:5.1f}% threshold.')
 exit(3)
 }

@@ -42,7 +42,7 @@ fn cleanup_tdir() {

 fn create_test(tname string, tcontent string) ?string {
 tpath := os.join_path(tdir, tname)
-os.write_file(tpath, tcontent) ?
+os.write_file(tpath, tcontent)?
 eprintln('>>>>>>>> tpath: $tpath | tcontent: $tcontent')
 return tpath
 }
@@ -52,17 +52,17 @@ fn main() {
 os.chdir(os.wd_at_startup) or {}
 }
 println('> vroot: $vroot | vexe: $vexe | tdir: $tdir')
-ok_fpath := create_test('a_single_ok_test.v', 'fn test_ok(){ assert true }') ?
+ok_fpath := create_test('a_single_ok_test.v', 'fn test_ok(){ assert true }')?
 check_ok('"$vexe" "$ok_fpath"')
 check_ok('"$vexe" test "$ok_fpath"')
 check_ok('"$vexe" test "$tdir"')
-fail_fpath := create_test('a_single_failing_test.v', 'fn test_fail(){ assert 1 == 2 }') ?
+fail_fpath := create_test('a_single_failing_test.v', 'fn test_fail(){ assert 1 == 2 }')?
 check_fail('"$vexe" "$fail_fpath"')
 check_fail('"$vexe" test "$fail_fpath"')
 check_fail('"$vexe" test "$tdir"')
 rel_dir := os.join_path(tdir, rand.ulid())
-os.mkdir(rel_dir) ?
-os.chdir(rel_dir) ?
+os.mkdir(rel_dir)?
+os.chdir(rel_dir)?
 check_ok('"$vexe" test "..${os.path_separator + os.base(ok_fpath)}"')
 println('> all done')
 }

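Many of the hunks above only change error propagation from `foo() ?` to `foo()?`: newer vfmt writes the `?` operator directly against the call, with no space. A minimal sketch of the style (hypothetical `read_config` helper, assuming current V syntax):

```v
import os

// The trailing `?` propagates any error from os.read_file to the caller;
// newer vfmt writes it with no space before the `?`.
fn read_config(path string) ?string {
	return os.read_file(path)?
}

fn main() {
	cfg := read_config('config.toml') or {
		eprintln('failed: $err')
		return
	}
	println(cfg)
}
```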
@@ -0,0 +1,50 @@
+// Copyright (c) 2019-2022 Alexander Medvednikov. All rights reserved.
+// Use of this source code is governed by an MIT license that can be found in the LICENSE file.
+module main
+
+import os
+import v.util
+
+const vexe = os.getenv('VEXE')
+
+fn main() {
+vmodules := os.vmodules_dir()
+c2v_dir := os.join_path(vmodules, 'c2v')
+mut c2v_bin := os.join_path(c2v_dir, 'c2v')
+$if windows {
+c2v_bin += '.exe'
+}
+// Git clone c2v
+if !os.exists(c2v_dir) {
+println('C2V is not installed. Cloning C2V to $c2v_dir ...')
+os.chdir(vmodules)?
+res := os.execute('git clone https://github.com/vlang/c2v')
+if res.exit_code != 0 {
+eprintln('Failed to download C2V.')
+exit(1)
+}
+}
+// Compile c2v
+if !os.exists(c2v_bin) {
+os.chdir(c2v_dir)?
+println('Compiling c2v ...')
+res2 := os.execute('${os.quoted_path(vexe)} -o ${os.quoted_path(c2v_bin)} -keepc -g -experimental .')
+if res2.exit_code != 0 {
+eprintln(res2.output)
+eprintln('Failed to compile C2V. This should not happen. Please report it via GitHub.')
+exit(2)
+}
+}
+if os.args.len < 3 {
+eprintln('Wrong number of arguments. Use `v translate file.c` .')
+exit(3)
+}
+passed_args := util.args_quote_paths(os.args[2..])
+// println(passed_args)
+os.chdir(os.wd_at_startup)?
+res := os.system('$c2v_bin $passed_args')
+if res != 0 {
+eprintln('C2V failed to translate the C files. Please report it via GitHub.')
+exit(4)
+}
+}

@@ -24,7 +24,7 @@ fn C.cJSON_CreateNull() &C.cJSON
 // fn C.cJSON_CreateNumber() &C.cJSON
 // fn C.cJSON_CreateString() &C.cJSON
-fn C.cJSON_CreateRaw(&byte) &C.cJSON
+fn C.cJSON_CreateRaw(&u8) &C.cJSON

 fn C.cJSON_IsInvalid(voidptr) bool

@@ -45,13 +45,13 @@ fn C.cJSON_IsObject(voidptr) bool
 fn C.cJSON_IsRaw(voidptr) bool

-fn C.cJSON_AddItemToObject(voidptr, &byte, voidptr)
+fn C.cJSON_AddItemToObject(voidptr, &u8, voidptr)

 fn C.cJSON_AddItemToArray(voidptr, voidptr)

 fn C.cJSON_Delete(voidptr)

-fn C.cJSON_Print(voidptr) &byte
+fn C.cJSON_Print(voidptr) &u8

 [inline]
 fn create_object() &C.cJSON {

@@ -44,7 +44,7 @@ fn main() {
 for hf in hfields.split(',') {
 ctx.hide_names[hf] = true
 }
-fp.limit_free_args_to_at_least(1) ?
+fp.limit_free_args_to_at_least(1)?
 rest_of_args := fp.remaining_parameters()
 for vfile in rest_of_args {
 file := get_abs_path(vfile)
@@ -283,7 +283,7 @@ fn (t Tree) embed_file(node ast.EmbeddedFile) &Node {
 obj.add('compression_type', t.string_node(node.compression_type))
 obj.add('is_compressed', t.bool_node(node.is_compressed))
 obj.add('len', t.number_node(node.len))
-obj.add('bytes', t.array_node_byte(node.bytes))
+obj.add('bytes', t.array_node_u8(node.bytes))
 return obj
 }

@@ -1216,7 +1216,7 @@ fn (t Tree) string_inter_literal(node ast.StringInterLiteral) &Node {
 obj.add_terse('pluss', t.array_node_bool(node.pluss))
 obj.add_terse('fills', t.array_node_bool(node.fills))
 obj.add_terse('fmt_poss', t.array_node_position(node.fmt_poss))
-obj.add_terse('fmts', t.array_node_byte(node.fmts))
+obj.add_terse('fmts', t.array_node_u8(node.fmts))
 obj.add_terse('need_fmts', t.array_node_bool(node.need_fmts))
 obj.add('pos', t.pos(node.pos))
 return obj
@@ -1358,6 +1358,7 @@ fn (t Tree) postfix_expr(node ast.PostfixExpr) &Node {
 obj.add_terse('expr', t.expr(node.expr))
 obj.add('auto_locked', t.string_node(node.auto_locked))
 obj.add('pos', t.pos(node.pos))
+obj.add('is_c2v_prefix', t.bool_node(node.is_c2v_prefix))
 return obj
 }

@@ -2209,7 +2210,7 @@ fn (t Tree) array_node_int(nodes []int) &Node {
 return arr
 }

-fn (t Tree) array_node_byte(nodes []byte) &Node {
+fn (t Tree) array_node_u8(nodes []u8) &Node {
 mut arr := new_array()
 for node in nodes {
 arr.add_item(t.number_node(node))

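The `array_node_byte` to `array_node_u8` renames above are part of V's broader migration from `byte` to `u8` as the canonical name of its unsigned 8-bit type (`byte` remained an alias at the time). A minimal sketch of the new spelling:

```v
// Sum all bytes of a string, using the u8 spelling this migration standardizes on.
fn checksum(data []u8) u8 {
	mut sum := u8(0)
	for b in data {
		sum += b
	}
	return sum
}

fn main() {
	println(checksum('abc'.bytes()))
}
```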
@@ -46,12 +46,12 @@ fn (context Context) footer() string {
 return ')\n'
 }

-fn (context Context) file2v(bname string, fbytes []byte, bn_max int) string {
+fn (context Context) file2v(bname string, fbytes []u8, bn_max int) string {
 mut sb := strings.new_builder(1000)
 bn_diff_len := bn_max - bname.len
 sb.write_string('\t${bname}_len' + ' '.repeat(bn_diff_len - 4) + ' = $fbytes.len\n')
 fbyte := fbytes[0]
-bnmae_line := '\t$bname' + ' '.repeat(bn_diff_len) + ' = [byte($fbyte), '
+bnmae_line := '\t$bname' + ' '.repeat(bn_diff_len) + ' = [u8($fbyte), '
 sb.write_string(bnmae_line)
 mut line_len := bnmae_line.len + 3
 for i := 1; i < fbytes.len; i++ {
@@ -73,11 +73,11 @@ fn (context Context) file2v(bname string, fbytes []byte, bn_max int) string {
 return sb.str()
 }

-fn (context Context) bname_and_bytes(file string) ?(string, []byte) {
+fn (context Context) bname_and_bytes(file string) ?(string, []u8) {
 fname := os.file_name(file)
 fname_escaped := fname.replace_each(['.', '_', '-', '_'])
 byte_name := '$context.prefix$fname_escaped'.to_lower()
-fbytes := os.read_bytes(file) or { return error('Error: $err.msg') }
+fbytes := os.read_bytes(file) or { return error('Error: $err.msg()') }
 return byte_name, fbytes
 }

@@ -108,7 +108,7 @@ fn main() {
 exit(0)
 }
 files := fp.finalize() or {
-eprintln('Error: $err.msg')
+eprintln('Error: $err.msg()')
 exit(1)
 }
 real_files := files.filter(it != 'bin2v')
@@ -120,22 +120,22 @@ fn main() {
 if context.write_file != '' && os.file_ext(context.write_file) !in ['.vv', '.v'] {
 context.write_file += '.v'
 }
-mut file_byte_map := map[string][]byte{}
+mut file_byte_map := map[string][]u8{}
 for file in real_files {
 bname, fbytes := context.bname_and_bytes(file) or {
-eprintln(err.msg)
+eprintln(err.msg())
 exit(1)
 }
 file_byte_map[bname] = fbytes
 }
 max_bname := context.max_bname_len(file_byte_map.keys())
 if context.write_file.len > 0 {
-mut out_file := os.create(context.write_file) ?
-out_file.write_string(context.header()) ?
+mut out_file := os.create(context.write_file)?
+out_file.write_string(context.header())?
 for bname, fbytes in file_byte_map {
-out_file.write_string(context.file2v(bname, fbytes, max_bname)) ?
+out_file.write_string(context.file2v(bname, fbytes, max_bname))?
 }
-out_file.write_string(context.footer()) ?
+out_file.write_string(context.footer())?
 } else {
 print(context.header())
 for bname, fbytes in file_byte_map {

@@ -12,7 +12,8 @@ const efolders = [
 fn main() {
 args_string := os.args[1..].join(' ')
 params := args_string.all_before('build-examples')
-skip_prefixes := efolders.map(os.real_path(os.join_path_single(vroot, it)))
+skip_prefixes := efolders.map(os.real_path(os.join_path_single(vroot, it)).replace('\\',
+'/'))
 res := testing.v_build_failing_skipped(params, 'examples', skip_prefixes, fn (mut session testing.TestSession) {
 for x in efolders {
 pathsegments := x.split_any('/')

@@ -23,7 +23,7 @@ fn main() {
 args_string := os.args[1..].join(' ')
 vexe := os.getenv('VEXE')
 vroot := os.dir(vexe)
-os.chdir(vroot) ?
+os.chdir(vroot)?
 folder := os.join_path('cmd', 'tools')
 tfolder := os.join_path(vroot, 'cmd', 'tools')
 main_label := 'Building $folder ...'
@@ -31,7 +31,7 @@ fn main() {
 //
 mut skips := []string{}
 for stool in tools_in_subfolders {
-skips << os.join_path(tfolder, stool)
+skips << os.join_path(tfolder, stool).replace('\\', '/')
 }
 buildopts := args_string.all_before('build-tools')
 mut session := testing.prepare_test_session(buildopts, folder, skips, main_label)
@@ -43,11 +43,11 @@ fn main() {
 // eprintln('> session.skip_files: $session.skip_files')
 session.test()
 eprintln(session.benchmark.total_message(finish_label))
-if session.failed {
+if session.failed_cmds.len > 0 {
 exit(1)
 }
 //
-mut executables := os.ls(session.vtmp_dir) ?
+mut executables := os.ls(session.vtmp_dir)?
 executables.sort()
 for texe in executables {
 tname := texe.replace(os.file_ext(texe), '')
@@ -66,8 +66,9 @@ fn main() {
 }
 target_path := os.join_path(tfolder, texe)
 os.mv_by_cp(tpath, target_path) or {
-if !err.msg.contains('vbuild-tools') && !err.msg.contains('vtest-all') {
-eprintln('error while moving $tpath to $target_path: $err.msg')
+emsg := err.msg()
+if !emsg.contains('vbuild-tools') && !emsg.contains('vtest-all') {
+eprintln('error while moving $tpath to $target_path: $emsg')
 }
 continue
 }

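The `err.msg` to `err.msg()` rewrites in this and several other hunks track V's `IError` change, where `msg` and `code` became methods instead of fields. A minimal sketch, assuming current V:

```v
import os

fn main() {
	os.read_file('no_such_file.txt') or {
		// msg() and code() are methods on IError in newer V;
		// the old field accesses err.msg / err.code were phased out.
		eprintln('error (${err.code()}): ${err.msg()}')
	}
}
```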
@@ -20,7 +20,7 @@ const (
 tool_version = \'1.2.1\'
-version: \'0.2.42\'
+VERSION = "1.23.8"


 Examples:
 Bump the patch version in v.mod if it exists
 v bump --patch

@@ -68,21 +68,21 @@ fn run_individual_test(case BumpTestCase) ? {
 test_file := os.join_path_single(temp_dir, case.file_name)

 os.rm(test_file) or {}
-os.write_file(test_file, case.contents) ?
+os.write_file(test_file, case.contents)?
 //
 os.execute_or_exit('${os.quoted_path(vexe)} bump --patch ${os.quoted_path(test_file)}')
-patch_lines := os.read_lines(test_file) ?
+patch_lines := os.read_lines(test_file)?
 assert patch_lines[case.line] == case.expected_patch

 os.execute_or_exit('${os.quoted_path(vexe)} bump --minor ${os.quoted_path(test_file)}')
-minor_lines := os.read_lines(test_file) ?
+minor_lines := os.read_lines(test_file)?
 assert minor_lines[case.line] == case.expected_minor

 os.execute_or_exit('${os.quoted_path(vexe)} bump --major ${os.quoted_path(test_file)}')
-major_lines := os.read_lines(test_file) ?
+major_lines := os.read_lines(test_file)?
 assert major_lines[case.line] == case.expected_major
 //
-os.rm(test_file) ?
+os.rm(test_file)?
 }

 fn test_all_bump_cases() {

@@ -76,7 +76,7 @@ SUBCMD:

 // Snooped from cmd/v/v.v, vlib/v/pref/pref.v
 const (
-auto_complete_commands = [
+auto_complete_commands = [
 // simple_cmd
 'ast',
 'doc',
@@ -114,7 +114,6 @@ const (
 'help',
 'new',
 'init',
 'complete',
 'translate',
 'self',
 'search',
@@ -130,8 +129,13 @@ const (
 'run',
 'build',
 'build-module',
+'missdoc',
 ]
-auto_complete_flags = [
+// Entries in the flag arrays below should be entered as is:
+// * Short flags, e.g.: "-v", should be entered: '-v'
+// * Long flags, e.g.: "--version", should be entered: '--version'
+// * Single-dash flags, e.g.: "-version", should be entered: '-version'
+auto_complete_flags = [
 '-apk',
 '-show-timings',
 '-check-syntax',
|
@ -150,6 +154,7 @@ const (
|
|||
'-autofree',
|
||||
'-compress',
|
||||
'-freestanding',
|
||||
'-no-parallel',
|
||||
'-no-preludes',
|
||||
'-prof',
|
||||
'-profile',
|
||||
|
@@ -190,7 +195,7 @@ const (
 '-version',
 '--version',
 ]
-auto_complete_flags_doc = [
+auto_complete_flags_doc = [
 '-all',
 '-f',
 '-h',
|
@ -209,7 +214,7 @@ const (
|
|||
'-s',
|
||||
'-l',
|
||||
]
|
||||
auto_complete_flags_fmt = [
|
||||
auto_complete_flags_fmt = [
|
||||
'-c',
|
||||
'-diff',
|
||||
'-l',
|
||||
|
@ -217,7 +222,7 @@ const (
|
|||
'-debug',
|
||||
'-verify',
|
||||
]
|
||||
auto_complete_flags_bin2v = [
|
||||
auto_complete_flags_bin2v = [
|
||||
'-h',
|
||||
'--help',
|
||||
'-m',
|
||||
|
@@ -227,22 +232,46 @@ const (
 '-w',
 '--write',
 ]
-auto_complete_flags_shader = [
-'help',
-'h',
-'force-update',
-'u',
-'verbose',
-'v',
-'slang',
-'l',
-'output',
-'o',
+auto_complete_flags_shader = [
+'--help',
+'-h',
+'--force-update',
+'-u',
+'--verbose',
+'-v',
+'--slang',
+'-l',
+'--output',
+'-o',
 ]
-auto_complete_flags_self = [
+auto_complete_flags_missdoc = [
+'--help',
+'-h',
+'--tags',
+'-t',
+'--deprecated',
+'-d',
+'--private',
+'-p',
+'--no-line-numbers',
+'-n',
+'--exclude',
+'-e',
+'--relative-paths',
+'-r',
+'--js',
+'--verify',
+'--diff',
+]
+auto_complete_flags_bump = [
+'--patch',
+'--minor',
+'--major',
+]
+auto_complete_flags_self = [
 '-prod',
 ]
-auto_complete_compilers = [
+auto_complete_compilers = [
 'cc',
 'gcc',
 'tcc',
@@ -372,12 +401,17 @@ fn auto_complete_request(args []string) []string {
 parent_command = parts[i]
 break
 }
-get_flags := fn (base []string, flag string) []string {
-if flag.len == 1 { return base
-} else { return base.filter(it.starts_with(flag))
-}
-}
-if part.starts_with('-') { // 'v -<tab>' -> flags.
+if part.starts_with('-') { // 'v [subcmd] -<tab>' or 'v [subcmd] --<tab>'-> flags.
+get_flags := fn (base []string, flag string) []string {
+mut results := []string{}
+for entry in base {
+if entry.starts_with(flag) {
+results << entry
+}
+}
+return results
+}
+
 match parent_command {
 'bin2v' { // 'v bin2v -<tab>'
 list = get_flags(auto_complete_flags_bin2v, part)
@@ -397,6 +431,12 @@ fn auto_complete_request(args []string) []string {
 'shader' { // 'v shader -<tab>' -> flags.
 list = get_flags(auto_complete_flags_shader, part)
 }
+'missdoc' { // 'v missdoc -<tab>' -> flags.
+list = get_flags(auto_complete_flags_missdoc, part)
+}
+'bump' { // 'v bump -<tab>' -> flags.
+list = get_flags(auto_complete_flags_bump, part)
+}
 else {
 for flag in auto_complete_flags {
 if flag == part {
@@ -414,6 +454,11 @@ fn auto_complete_request(args []string) []string {
 }
 }
 }
+// Clear the list if the result is identical to the part examined
+// (the flag must have already been completed)
+if list.len == 1 && part == list[0] {
+list.clear()
+}
 } else {
 match part {
 'help' { // 'v help <tab>' -> top level commands except "help".

@@ -160,7 +160,7 @@ fn create(args []string) {
 if c.version == '' {
 c.version = default_version
 }
-default_license := 'MIT'
+default_license := os.getenv_opt('VLICENSE') or { 'MIT' }
 c.license = os.input('Input your project license: ($default_license) ')
 if c.license == '' {
 c.license = default_license

@@ -5,7 +5,7 @@ const test_path = 'vcreate_test'
 fn init_and_check() ? {
 os.execute_or_exit('${os.quoted_path(@VEXE)} init')

-assert os.read_file('vcreate_test.v') ? == [
+assert os.read_file('vcreate_test.v')? == [
 'module main\n',
 'fn main() {',
 " println('Hello World!')",
@@ -13,7 +13,7 @@ fn init_and_check() ? {
 '',
 ].join_lines()

-assert os.read_file('v.mod') ? == [
+assert os.read_file('v.mod')? == [
 'Module {',
 " name: 'vcreate_test'",
 " description: ''",
@@ -24,7 +24,7 @@ fn init_and_check() ? {
 '',
 ].join_lines()

-assert os.read_file('.gitignore') ? == [
+assert os.read_file('.gitignore')? == [
 '# Binaries for programs and plugins',
 'main',
 'vcreate_test',
@@ -37,7 +37,7 @@ fn init_and_check() ? {
 '',
 ].join_lines()

-assert os.read_file('.gitattributes') ? == [
+assert os.read_file('.gitattributes')? == [
 '*.v linguist-language=V text=auto eol=lf',
 '*.vv linguist-language=V text=auto eol=lf',
 '*.vsh linguist-language=V text=auto eol=lf',
@@ -45,7 +45,7 @@ fn init_and_check() ? {
 '',
 ].join_lines()

-assert os.read_file('.editorconfig') ? == [
+assert os.read_file('.editorconfig')? == [
 '[*]',
 'charset = utf-8',
 'end_of_line = lf',
@@ -66,9 +66,9 @@ fn test_v_init() ? {
 defer {
 os.rmdir_all(dir) or {}
 }
-os.chdir(dir) ?
+os.chdir(dir)?

-init_and_check() ?
+init_and_check()?
 }

 fn test_v_init_in_git_dir() ? {
@@ -78,24 +78,24 @@ fn test_v_init_in_git_dir() ? {
 defer {
 os.rmdir_all(dir) or {}
 }
-os.chdir(dir) ?
+os.chdir(dir)?
 os.execute_or_exit('git init .')
-init_and_check() ?
+init_and_check()?
 }

 fn test_v_init_no_overwrite_gitignore() ? {
 dir := os.join_path(os.temp_dir(), test_path)
 os.rmdir_all(dir) or {}
 os.mkdir(dir) or {}
-os.write_file('$dir/.gitignore', 'blah') ?
+os.write_file('$dir/.gitignore', 'blah')?
 defer {
 os.rmdir_all(dir) or {}
 }
-os.chdir(dir) ?
+os.chdir(dir)?

 os.execute_or_exit('${os.quoted_path(@VEXE)} init')

-assert os.read_file('.gitignore') ? == 'blah'
+assert os.read_file('.gitignore')? == 'blah'
 }

 fn test_v_init_no_overwrite_gitattributes_and_editorconfig() ? {
@@ -114,15 +114,15 @@ indent_size = 4
 dir := os.join_path(os.temp_dir(), test_path)
 os.rmdir_all(dir) or {}
 os.mkdir(dir) or {}
-os.write_file('$dir/.gitattributes', git_attributes_content) ?
-os.write_file('$dir/.editorconfig', editor_config_content) ?
+os.write_file('$dir/.gitattributes', git_attributes_content)?
+os.write_file('$dir/.editorconfig', editor_config_content)?
 defer {
 os.rmdir_all(dir) or {}
 }
-os.chdir(dir) ?
+os.chdir(dir)?

 os.execute_or_exit('${os.quoted_path(@VEXE)} init')

-assert os.read_file('.gitattributes') ? == git_attributes_content
-assert os.read_file('.editorconfig') ? == editor_config_content
+assert os.read_file('.gitattributes')? == git_attributes_content
+assert os.read_file('.editorconfig')? == editor_config_content
 }

@@ -12,9 +12,13 @@ import v.doc
 import v.pref

 const (
-css_js_assets = ['doc.css', 'normalize.css', 'doc.js', 'dark-mode.js']
-default_theme = os.resource_abs_path('theme')
-link_svg = '<svg xmlns="http://www.w3.org/2000/svg" height="24" viewBox="0 0 24 24" width="24"><path d="M0 0h24v24H0z" fill="none"/><path d="M3.9 12c0-1.71 1.39-3.1 3.1-3.1h4V7H7c-2.76 0-5 2.24-5 5s2.24 5 5 5h4v-1.9H7c-1.71 0-3.1-1.39-3.1-3.1zM8 13h8v-2H8v2zm9-6h-4v1.9h4c1.71 0 3.1 1.39 3.1 3.1s-1.39 3.1-3.1 3.1h-4V17h4c2.76 0 5-2.24 5-5s-2.24-5-5-5z"/></svg>'
+css_js_assets = ['doc.css', 'normalize.css', 'doc.js', 'dark-mode.js']
+default_theme = os.resource_abs_path('theme')
+link_svg = '<svg xmlns="http://www.w3.org/2000/svg" height="24" viewBox="0 0 24 24" width="24"><path d="M0 0h24v24H0z" fill="none"/><path d="M3.9 12c0-1.71 1.39-3.1 3.1-3.1h4V7H7c-2.76 0-5 2.24-5 5s2.24 5 5 5h4v-1.9H7c-1.71 0-3.1-1.39-3.1-3.1zM8 13h8v-2H8v2zm9-6h-4v1.9h4c1.71 0 3.1 1.39 3.1 3.1s-1.39 3.1-3.1 3.1h-4V17h4c2.76 0 5-2.24 5-5s-2.24-5-5-5z"/></svg>'
+
+single_quote = "'"
+double_quote = '"'
+no_quotes_replacement = [single_quote, '', double_quote, '']
 )

 enum HighlightTokenTyp {
@@ -298,6 +302,8 @@ fn html_highlight(code string, tb &ast.Table) string {
 "'$tok.lit'"
 } else if typ == .char {
 '`$tok.lit`'
+} else if typ == .comment {
+if tok.lit != '' && tok.lit[0] == 1 { '//${tok.lit[1..]}' } else { '//$tok.lit' }
 } else {
 tok.lit
 }
@@ -320,7 +326,8 @@ fn html_highlight(code string, tb &ast.Table) string {
 tok_typ = .builtin
 } else if next_tok.kind == .lcbr {
 tok_typ = .symbol
-} else if next_tok.kind == .lpar {
+} else if next_tok.kind == .lpar || (!tok.lit[0].is_capital()
+&& next_tok.kind == .lt && next_tok.pos == tok.pos + tok.lit.len) {
 tok_typ = .function
 } else {
 tok_typ = .name
@@ -341,14 +348,15 @@ fn html_highlight(code string, tb &ast.Table) string {
 .key_true, .key_false {
 tok_typ = .boolean
 }
-.lpar, .lcbr, .rpar, .rcbr, .lsbr, .rsbr, .semicolon, .colon, .comma, .dot {
+.lpar, .lcbr, .rpar, .rcbr, .lsbr, .rsbr, .semicolon, .colon, .comma, .dot,
+.dotdot, .ellipsis {
 tok_typ = .punctuation
 }
 else {
 if token.is_key(tok.lit) || token.is_decl(tok.kind) {
 tok_typ = .keyword
 } else if tok.kind == .decl_assign || tok.kind.is_assign() || tok.is_unary()
-|| tok.kind.is_relational() || tok.kind.is_infix() {
+|| tok.kind.is_relational() || tok.kind.is_infix() || tok.kind.is_postfix() {
 tok_typ = .operator
 }
 }
@@ -362,7 +370,7 @@ fn html_highlight(code string, tb &ast.Table) string {
 break
 }
 } else {
-buf.write_byte(code[i])
+buf.write_u8(code[i])
 i++
 }
 }
@@ -385,8 +393,9 @@ fn doc_node_html(dn doc.DocNode, link string, head bool, include_examples bool,
 highlighted_code := html_highlight(dn.content, tb)
 node_class := if dn.kind == .const_group { ' const' } else { '' }
 sym_name := get_sym_name(dn)
-has_deprecated := 'deprecated' in dn.tags
-mut tags := dn.tags.filter(it != 'deprecated')
+mut deprecated_tags := dn.tags.filter(it.starts_with('deprecated'))
+deprecated_tags.sort()
+mut tags := dn.tags.filter(!it.starts_with('deprecated'))
 tags.sort()
 mut node_id := get_node_id(dn)
 mut hash_link := if !head { ' <a href="#$node_id">#</a>' } else { '' }
@@ -406,13 +415,12 @@ fn doc_node_html(dn doc.DocNode, link string, head bool, include_examples bool,
 }
 dnw.write_string('</div>')
 }
-if tags.len > 0 || has_deprecated {
-mut attributes := if has_deprecated {
-'<div class="attribute attribute-deprecated">deprecated</div>'
-} else {
-''
-}
-attributes += tags.map('<div class="attribute">$it</div>').join('')
+if deprecated_tags.len > 0 {
+attributes := deprecated_tags.map('<div class="attribute attribute-deprecated">${no_quotes(it)}</div>').join('')
 dnw.writeln('<div class="attributes">$attributes</div>')
 }
+if tags.len > 0 {
+attributes := tags.map('<div class="attribute">$it</div>').join('')
+dnw.writeln('<div class="attributes">$attributes</div>')
+}
 if !head && dn.content.len > 0 {
@@ -426,8 +434,8 @@ fn doc_node_html(dn doc.DocNode, link string, head bool, include_examples bool,
 example_title := if examples.len > 1 { 'Examples' } else { 'Example' }
 dnw.writeln('<section class="doc-node examples"><h4>$example_title</h4>')
 for example in examples {
-// hl_example := html_highlight(example, tb)
-dnw.writeln('<pre><code class="language-v">$example</code></pre>')
+hl_example := html_highlight(example, tb)
+dnw.writeln('<pre><code class="language-v">$hl_example</code></pre>')
 }
 dnw.writeln('</section>')
 }
@@ -494,3 +502,7 @@ fn write_toc(dn doc.DocNode, mut toc strings.Builder) {
 }
 toc.writeln('</li>')
 }
+
+fn no_quotes(s string) string {
+return s.replace_each(no_quotes_replacement)
+}

@@ -1,7 +1,13 @@
 module main

 const (
-source_root = 'temp'
+source_root = 'temp' // some const
+another = int(5) //
 )
+const (
+windowpos_undefined_mask = C.SDL_WINDOWPOS_UNDEFINED_MASK // 0x1FFF0000u
+windowpos_undefined = C.SDL_WINDOWPOS_UNDEFINED //
+)
+Used to indicate that you don't care what the window position is.
 fn funky()
-funky - comment for function below
+funky - comment for function below
@@ -1,6 +1,11 @@
 module main

 const (
-source_root = 'temp'
+source_root = 'temp' // some const
+another = int(5) //
 )
-fn funky()
+const (
+windowpos_undefined_mask = C.SDL_WINDOWPOS_UNDEFINED_MASK // 0x1FFF0000u
+windowpos_undefined = C.SDL_WINDOWPOS_UNDEFINED //
+)
+fn funky()
@@ -1,5 +1,12 @@
 pub const (
-source_root = 'temp'
+source_root = 'temp' // some const
+another = int(5) //
 )

+// Used to indicate that you don't care what the window position is.
+pub const (
+windowpos_undefined_mask = C.SDL_WINDOWPOS_UNDEFINED_MASK // 0x1FFF0000u
+windowpos_undefined = C.SDL_WINDOWPOS_UNDEFINED //
+)
+
 // funky - comment for function below

@@ -16,7 +16,7 @@ fn find_diff_cmd() string {

 fn test_vet() ? {
 os.setenv('VCOLORS', 'never', true)
-os.chdir(vroot) ?
+os.chdir(vroot)?
 test_dir := 'cmd/tools/vdoc/tests/testdata'
 main_files := get_main_files_in_dir(test_dir)
 fails := check_path(vexe, test_dir, main_files)

@@ -161,6 +161,13 @@ fn color_highlight(code string, tb &ast.Table) string {
 .char {
 lit = term.yellow('`$tok.lit`')
 }
+.comment {
+lit = if tok.lit != '' && tok.lit[0] == 1 {
+'//${tok.lit[1..]}'
+} else {
+'//$tok.lit'
+}
+}
 .keyword {
 lit = term.bright_blue(tok.lit)
 }
@@ -206,15 +213,18 @@ fn color_highlight(code string, tb &ast.Table) string {
 } else if
 next_tok.kind in [.lcbr, .rpar, .eof, .comma, .pipe, .name, .rcbr, .assign, .key_pub, .key_mut, .pipe, .comma]
 && prev.kind in [.name, .amp, .rsbr, .key_type, .assign, .dot, .question, .rpar, .key_struct, .key_enum, .pipe, .key_interface]
-&& (tok.lit[0].ascii_str().is_upper() || prev_prev.lit in ['C', 'JS']) {
+&& ((tok.lit != '' && tok.lit[0].is_capital())
+|| prev_prev.lit in ['C', 'JS']) {
 tok_typ = .symbol
-} else if next_tok.kind in [.lpar, .lt] {
+} else if next_tok.kind == .lpar
+|| (!(tok.lit != '' && tok.lit[0].is_capital()) && next_tok.kind == .lt
+&& next_tok.pos == tok.pos + tok.lit.len) {
 tok_typ = .function
 } else if next_tok.kind == .dot {
 if tok.lit in ['C', 'JS'] {
 tok_typ = .prefix
 } else {
-if tok.lit[0].ascii_str().is_upper() {
+if tok.lit != '' && tok.lit[0].is_capital() {
 tok_typ = .symbol
 } else {
 tok_typ = .module_
@@ -241,7 +251,8 @@ fn color_highlight(code string, tb &ast.Table) string {
 .key_true, .key_false {
 tok_typ = .boolean
 }
-.lpar, .lcbr, .rpar, .rcbr, .lsbr, .rsbr, .semicolon, .colon, .comma, .dot {
+.lpar, .lcbr, .rpar, .rcbr, .lsbr, .rsbr, .semicolon, .colon, .comma, .dot,
+.dotdot, .ellipsis {
 tok_typ = .punctuation
 }
 .key_none {
@@ -251,7 +262,7 @@ fn color_highlight(code string, tb &ast.Table) string {
 if token.is_key(tok.lit) || token.is_decl(tok.kind) {
 tok_typ = .keyword
 } else if tok.kind == .decl_assign || tok.kind.is_assign() || tok.is_unary()
-|| tok.kind.is_relational() || tok.kind.is_infix() {
+|| tok.kind.is_relational() || tok.kind.is_infix() || tok.kind.is_postfix() {
 tok_typ = .operator
 }
 }
@@ -266,7 +277,7 @@ fn color_highlight(code string, tb &ast.Table) string {
 tok = next_tok
 next_tok = s.scan()
 } else {
-buf.write_byte(code[i])
+buf.write_u8(code[i])
 i++
 }
 }

@@ -104,13 +104,17 @@ fn (vd VDoc) gen_plaintext(d doc.Doc) string {
 d.head.merge_comments_without_examples()
 }
 if comments.trim_space().len > 0 {
-pw.writeln(comments.split_into_lines().map(' ' + it).join('\n'))
+pw.writeln(indent(comments))
 }
 }
 vd.write_plaintext_content(d.contents.arr(), mut pw)
 return pw.str()
 }

+fn indent(s string) string {
+return ' ' + s.replace('\n', '\n ')
+}
+
 fn (vd VDoc) write_plaintext_content(contents []doc.DocNode, mut pw strings.Builder) {
 cfg := vd.cfg
 for cn in contents {
@ -121,12 +125,24 @@ fn (vd VDoc) write_plaintext_content(contents []doc.DocNode, mut pw strings.Buil
|
|||
pw.writeln(cn.content)
|
||||
}
|
||||
if cn.comments.len > 0 && cfg.include_comments {
|
||||
comments := if cfg.include_examples {
|
||||
cn.merge_comments()
|
||||
} else {
|
||||
cn.merge_comments_without_examples()
|
||||
comments := cn.merge_comments_without_examples()
|
||||
pw.writeln(indent(comments.trim_space()))
|
||||
if cfg.include_examples {
|
||||
examples := cn.examples()
|
||||
for ex in examples {
|
||||
pw.write_string(' Example: ')
|
||||
mut fex := ex
|
||||
if ex.index_u8(`\n`) >= 0 {
|
||||
// multi-line example
|
||||
pw.write_u8(`\n`)
|
||||
fex = indent(ex)
|
||||
}
|
||||
if cfg.is_color {
|
||||
fex = color_highlight(fex, vd.docs[0].table)
|
||||
}
|
||||
pw.writeln(fex)
|
||||
}
|
||||
}
|
||||
pw.writeln(comments.trim_space().split_into_lines().map(' ' + it).join('\n'))
|
||||
}
|
||||
if cfg.show_loc {
|
||||
pw.writeln('Location: $cn.file_path:${cn.pos.line_nr + 1}\n')
|
||||
|
@ -228,8 +244,8 @@ fn (vd VDoc) get_readme(path string) string {
|
|||
|
||||
fn (vd VDoc) emit_generate_err(err IError) {
|
||||
cfg := vd.cfg
|
||||
mut err_msg := err.msg
|
||||
if err.code == 1 {
|
||||
mut err_msg := err.msg()
|
||||
if err.code() == 1 {
|
||||
mod_list := get_modules_list(cfg.input_path, []string{})
|
||||
println('Available modules:\n==================')
|
||||
for mod in mod_list {
|
||||
|
@ -451,7 +467,7 @@ fn parse_arguments(args []string) Config {
|
|||
exit(1)
|
||||
}
|
||||
selected_platform := doc.platform_from_string(platform_str) or {
|
||||
eprintln(err.msg)
|
||||
eprintln(err.msg())
|
||||
exit(1)
|
||||
}
|
||||
cfg.platform = selected_platform
|
||||
|
|
|
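For reference, the `indent` helper introduced in the vdoc change above can be exercised on its own. This is a minimal sketch: the helper body is copied from the diff, while the `main` wrapper is illustrative only.

```v
// indent prefixes every line of s (copied from the vdoc diff above).
fn indent(s string) string {
	return ' ' + s.replace('\n', '\n ')
}

fn main() {
	// each line of the input receives the same leading prefix
	println(indent('first line\nsecond line'))
}
```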
@@ -61,9 +61,7 @@ fn main() {
 	if term_colors {
 		os.setenv('VCOLORS', 'always', true)
 	}
-	if foptions.is_verbose {
-		eprintln('vfmt foptions: $foptions')
-	}
+	foptions.vlog('vfmt foptions: $foptions')
 	if foptions.is_worker {
 		// -worker should be added by a parent vfmt process.
 		// We launch a sub process for each file because
@@ -82,7 +80,7 @@ fn main() {
 		eprintln('vfmt possible_files: ' + possible_files.str())
 	}
 	files := util.find_all_v_files(possible_files) or {
-		verror(err.msg)
+		verror(err.msg())
 		return
 	}
 	if os.is_atty(0) == 0 && files.len == 0 {
@@ -109,9 +107,7 @@ fn main() {
 		mut worker_command_array := cli_args_no_files.clone()
 		worker_command_array << ['-worker', util.quote_path(fpath)]
 		worker_cmd := worker_command_array.join(' ')
-		if foptions.is_verbose {
-			eprintln('vfmt worker_cmd: $worker_cmd')
-		}
+		foptions.vlog('vfmt worker_cmd: $worker_cmd')
 		worker_result := os.execute(worker_cmd)
 		// Guard against a possibly crashing worker process.
 		if worker_result.exit_code != 0 {
@@ -151,43 +147,44 @@ fn main() {
 	}
 }
 
-fn (foptions &FormatOptions) format_file(file string) {
+fn setup_preferences_and_table() (&pref.Preferences, &ast.Table) {
+	table := ast.new_table()
 	mut prefs := pref.new_preferences()
 	prefs.is_fmt = true
+	prefs.skip_warnings = true
+	return prefs, table
+}
+
+fn (foptions &FormatOptions) vlog(msg string) {
 	if foptions.is_verbose {
-		eprintln('vfmt2 running fmt.fmt over file: $file')
+		eprintln(msg)
 	}
-	table := ast.new_table()
-	// checker := checker.new_checker(table, prefs)
+}
+
+fn (foptions &FormatOptions) format_file(file string) {
+	foptions.vlog('vfmt2 running fmt.fmt over file: $file')
+	prefs, table := setup_preferences_and_table()
 	file_ast := parser.parse_file(file, table, .parse_comments, prefs)
 	// checker.check(file_ast)
 	// checker.new_checker(table, prefs).check(file_ast)
 	formatted_content := fmt.fmt(file_ast, table, prefs, foptions.is_debug)
 	file_name := os.file_name(file)
 	ulid := rand.ulid()
 	vfmt_output_path := os.join_path(vtmp_folder, 'vfmt_${ulid}_$file_name')
 	os.write_file(vfmt_output_path, formatted_content) or { panic(err) }
-	if foptions.is_verbose {
-		eprintln('fmt.fmt worked and $formatted_content.len bytes were written to $vfmt_output_path .')
-	}
+	foptions.vlog('fmt.fmt worked and $formatted_content.len bytes were written to $vfmt_output_path .')
 	eprintln('$formatted_file_token$vfmt_output_path')
 }
 
 fn (foptions &FormatOptions) format_pipe() {
-	mut prefs := pref.new_preferences()
-	prefs.is_fmt = true
-	if foptions.is_verbose {
-		eprintln('vfmt2 running fmt.fmt over stdin')
-	}
+	foptions.vlog('vfmt2 running fmt.fmt over stdin')
+	prefs, table := setup_preferences_and_table()
 	input_text := os.get_raw_lines_joined()
-	table := ast.new_table()
-	// checker := checker.new_checker(table, prefs)
 	file_ast := parser.parse_text(input_text, '', table, .parse_comments, prefs)
 	// checker.check(file_ast)
 	// checker.new_checker(table, prefs).check(file_ast)
 	formatted_content := fmt.fmt(file_ast, table, prefs, foptions.is_debug)
 	print(formatted_content)
-	if foptions.is_verbose {
-		eprintln('fmt.fmt worked and $formatted_content.len bytes were written to stdout.')
-	}
 	flush_stdout()
+	foptions.vlog('fmt.fmt worked and $formatted_content.len bytes were written to stdout.')
 }
 
 fn print_compiler_options(compiler_params &pref.Preferences) {
@@ -234,9 +231,7 @@ fn (mut foptions FormatOptions) post_process_file(file string, formatted_file_path
 		return
 	}
 	diff_cmd := foptions.find_diff_cmd()
-	if foptions.is_verbose {
-		eprintln('Using diff command: $diff_cmd')
-	}
+	foptions.vlog('Using diff command: $diff_cmd')
 	diff := diff.color_compare_files(diff_cmd, file, formatted_file_path)
 	if diff.len > 0 {
 		println(diff)
@@ -247,12 +242,8 @@ fn (mut foptions FormatOptions) post_process_file(file string, formatted_file_path
 		if !is_formatted_different {
 			return
 		}
-		x := diff.color_compare_files(foptions.find_diff_cmd(), file, formatted_file_path)
-		if x.len != 0 {
-			println("$file is not vfmt'ed")
-			return error('')
-		}
-		return
+		println("$file is not vfmt'ed")
+		return error('')
 	}
 	if foptions.is_c {
 		if is_formatted_different {
@@ -289,6 +280,7 @@ fn (mut foptions FormatOptions) post_process_file(file string, formatted_file_path
 		return
 	}
 	print(formatted_fc)
+	flush_stdout()
 }
 
 fn (f FormatOptions) str() string {
@@ -37,7 +37,7 @@ import flag
 import toml
 
 const (
-	tool_name = os.file_name(os.executable())
+	tool_name = 'vgret'
 	tool_version = '0.0.1'
 	tool_description = '\n Dump and/or compare rendered frames of `gg` based apps
 
@@ -57,7 +57,7 @@ Examples:
 const (
 	supported_hosts = ['linux']
 	// External tool executables
-	v_exe = vexe()
+	v_exe = os.getenv('VEXE')
 	idiff_exe = os.find_abs_path_of_executable('idiff') or { '' }
 )
 
@@ -105,11 +105,27 @@ mut:
 	config Config
 }
 
+fn (opt Options) verbose_execute(cmd string) os.Result {
+	opt.verbose_eprintln('Running `$cmd`')
+	return os.execute(cmd)
+}
+
+fn (opt Options) verbose_eprintln(msg string) {
+	if opt.verbose {
+		eprintln(msg)
+	}
+}
+
 fn main() {
-	if os.args.len == 1 {
-		println('Usage: $tool_name PATH \n$tool_description\n$tool_name -h for more help...')
+	if runtime_os !in supported_hosts {
+		eprintln('$tool_name is currently only supported on $supported_hosts hosts')
 		exit(1)
 	}
+	if os.args.len == 1 {
+		eprintln('Usage: $tool_name PATH \n$tool_description\n$tool_name -h for more help...')
+		exit(1)
+	}
 
 	mut fp := flag.new_flag_parser(os.args[1..])
 	fp.application(tool_name)
 	fp.version(tool_version)
@@ -131,22 +147,22 @@ fn main() {
 	}
 
 	toml_conf := fp.string('toml-config', `t`, default_toml, 'Path or string with TOML configuration')
 
-	ensure_env(opt) or { panic(err) }
-
-	arg_paths := fp.finalize() or { panic(err) }
-
+	arg_paths := fp.finalize()?
 	if arg_paths.len == 0 {
 		println(fp.usage())
 		println('\nError missing arguments')
 		exit(1)
 	}
 
-	opt.config = new_config(opt.root_path, toml_conf) ?
+	if !os.exists(tmp_dir) {
+		os.mkdir_all(tmp_dir)?
+	}
+
+	opt.config = new_config(opt.root_path, toml_conf)?
 
 	gen_in_path := arg_paths[0]
 	if arg_paths.len >= 1 {
-		generate_screenshots(mut opt, gen_in_path) ?
+		generate_screenshots(mut opt, gen_in_path)?
 	}
 	if arg_paths.len > 1 {
 		target_path := arg_paths[1]
@@ -154,13 +170,15 @@ fn main() {
 		all_paths_in_use := [path, gen_in_path, target_path]
 		for path_in_use in all_paths_in_use {
 			if !os.is_dir(path_in_use) {
-				panic('`$path_in_use` is not a directory')
+				eprintln('`$path_in_use` is not a directory')
+				exit(1)
 			}
 		}
 		if path == target_path || gen_in_path == target_path || gen_in_path == path {
-			panic('Compare paths can not be the same directory `$path`/`$target_path`/`$gen_in_path`')
+			eprintln('Compare paths can not be the same directory `$path`/`$target_path`/`$gen_in_path`')
+			exit(1)
 		}
-		compare_screenshots(opt, gen_in_path, target_path) or { panic(err) }
+		compare_screenshots(opt, gen_in_path, target_path)?
 	}
 }
 
@@ -184,22 +202,16 @@ fn generate_screenshots(mut opt Options, output_path string) ? {
 			rel_out_path = file
 		}
 
-		if opt.verbose {
-			eprintln('Compiling shaders (if needed) for `$file`')
-		}
-		sh_result := os.execute('${os.quoted_path(v_exe)} shader ${os.quoted_path(app_path)}')
+		opt.verbose_eprintln('Compiling shaders (if needed) for `$file`')
+		sh_result := opt.verbose_execute('${os.quoted_path(v_exe)} shader ${os.quoted_path(app_path)}')
 		if sh_result.exit_code != 0 {
-			if opt.verbose {
-				eprintln('Skipping shader compile for `$file` v shader failed with:\n$sh_result.output')
-			}
+			opt.verbose_eprintln('Skipping shader compile for `$file` v shader failed with:\n$sh_result.output')
 			continue
 		}
 
 		if !os.exists(dst_path) {
-			if opt.verbose {
-				eprintln('Creating output path `$dst_path`')
-			}
-			os.mkdir_all(dst_path) ?
+			opt.verbose_eprintln('Creating output path `$dst_path`')
+			os.mkdir_all(dst_path)?
 		}
 
 		screenshot_path := os.join_path(dst_path, rel_out_path)
@@ -211,7 +223,7 @@ fn generate_screenshots(mut opt Options, output_path string) ? {
 
 		app_config.screenshots_path = screenshot_path
 		app_config.screenshots = take_screenshots(opt, app_config) or {
-			return error('Failed taking screenshots of `$app_path`:\n$err.msg')
+			return error('Failed taking screenshots of `$app_path`:\n$err.msg()')
 		}
 	}
 }
@@ -221,18 +233,13 @@ fn compare_screenshots(opt Options, output_path string, target_path string) ? {
 	mut warns := map[string]string{}
 	for app_config in opt.config.apps {
 		screenshots := app_config.screenshots
-		if opt.verbose {
-			eprintln('Comparing $screenshots.len screenshots in `$output_path` with `$target_path`')
-		}
+		opt.verbose_eprintln('Comparing $screenshots.len screenshots in `$output_path` with `$target_path`')
 		for screenshot in screenshots {
 			relative_screenshot := screenshot.all_after(output_path + os.path_separator)
 
 			src := screenshot
 			target := os.join_path(target_path, relative_screenshot)
 
-			if opt.verbose {
-				eprintln('Comparing `$src` with `$target` with $app_config.compare.method')
-			}
+			opt.verbose_eprintln('Comparing `$src` with `$target` with $app_config.compare.method')
 
 			if app_config.compare.method == 'idiff' {
 				if idiff_exe == '' {
@@ -242,14 +249,9 @@ fn compare_screenshots(opt Options, output_path string, target_path string) ? {
 					'.diff.tif')
 				flags := app_config.compare.flags.join(' ')
 				diff_cmd := '${os.quoted_path(idiff_exe)} $flags -abs -od -o ${os.quoted_path(diff_file)} -abs ${os.quoted_path(src)} ${os.quoted_path(target)}'
-				if opt.verbose {
-					eprintln('Running: $diff_cmd')
-				}
-
-				result := os.execute(diff_cmd)
-
-				if opt.verbose && result.exit_code == 0 {
-					eprintln('OUTPUT: \n$result.output')
+				result := opt.verbose_execute(diff_cmd)
+				if result.exit_code == 0 {
+					opt.verbose_eprintln('OUTPUT: \n$result.output')
 				}
 				if result.exit_code != 0 {
 					eprintln('OUTPUT: \n$result.output')
@@ -278,15 +280,19 @@ fn compare_screenshots(opt Options, output_path string, target_path string) ? {
 		}
 		first := fails.keys()[0]
 		fail_copy := os.join_path(os.temp_dir(), 'fail.' + first.all_after_last('.'))
-		os.cp(first, fail_copy) or { panic(err) }
+		os.cp(first, fail_copy)?
 		eprintln('First failed file `$first` is copied to `$fail_copy`')
 
 		diff_file := os.join_path(os.temp_dir(), os.file_name(first).all_before_last('.') +
 			'.diff.tif')
 		diff_copy := os.join_path(os.temp_dir(), 'diff.tif')
 		if os.is_file(diff_file) {
-			os.cp(diff_file, diff_copy) or { panic(err) }
+			os.cp(diff_file, diff_copy)?
 			eprintln('First failed diff file `$diff_file` is copied to `$diff_copy`')
+			eprintln('Removing alpha channel from $diff_copy ...')
+			final_fail_result_file := os.join_path(os.temp_dir(), 'diff.png')
+			opt.verbose_execute('convert ${os.quoted_path(diff_copy)} -alpha off ${os.quoted_path(final_fail_result_file)}')
+			eprintln('Final diff file: `$final_fail_result_file`')
 		}
 		exit(1)
 	}
@@ -295,25 +301,16 @@ fn compare_screenshots(opt Options, output_path string, target_path string) ? {
 fn take_screenshots(opt Options, app AppConfig) ?[]string {
 	out_path := app.screenshots_path
 	if !opt.compare_only {
-		if opt.verbose {
-			eprintln('Taking screenshot(s) of `$app.path` to `$out_path`')
-		}
-
+		opt.verbose_eprintln('Taking screenshot(s) of `$app.path` to `$out_path`')
 		if app.capture.method == 'gg_record' {
 			for k, v in app.capture.env {
 				rv := v.replace('\$OUT_PATH', out_path)
-				if opt.verbose {
-					eprintln('Setting ENV `$k` = $rv ...')
-				}
+				opt.verbose_eprintln('Setting ENV `$k` = $rv ...')
 				os.setenv('$k', rv, true)
 			}
 
 			mut flags := app.capture.flags.join(' ')
-			v_cmd := '${os.quoted_path(v_exe)} $flags -d gg_record run ${os.quoted_path(app.abs_path)}'
-			if opt.verbose {
-				eprintln('Running `$v_cmd`')
-			}
-			result := os.execute('$v_cmd')
+			result := opt.verbose_execute('${os.quoted_path(v_exe)} $flags -d gg_record run ${os.quoted_path(app.abs_path)}')
 			if result.exit_code != 0 {
 				return error('Failed taking screenshot of `$app.abs_path`:\n$result.output')
 			}
@@ -329,35 +326,11 @@ fn take_screenshots(opt Options, app AppConfig) ?[]string {
 	return screenshots
 }
 
-// ensure_env returns nothing if everything is okay.
-fn ensure_env(opt Options) ? {
-	if !os.exists(tmp_dir) {
-		os.mkdir_all(tmp_dir) ?
-	}
-
-	if runtime_os !in supported_hosts {
-		return error('$tool_name is currently only supported on $supported_hosts hosts')
-	}
-}
-
-// vexe returns the absolute path to the V compiler.
-fn vexe() string {
-	mut exe := os.getenv('VEXE')
-	if os.is_executable(exe) {
-		return os.real_path(exe)
-	}
-	possible_symlink := os.find_abs_path_of_executable('v') or { '' }
-	if os.is_executable(possible_symlink) {
-		exe = os.real_path(possible_symlink)
-	}
-	return exe
-}
-
 fn new_config(root_path string, toml_config string) ?Config {
 	doc := if os.is_file(toml_config) {
-		toml.parse_file(toml_config) ?
+		toml.parse_file(toml_config)?
 	} else {
-		toml.parse_text(toml_config) ?
+		toml.parse_text(toml_config)?
 	}
 
 	path := os.real_path(root_path).trim_right('/')
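The vgret refactor above replaces repeated `if opt.verbose { eprintln(...) }` blocks with two small receiver methods. A standalone sketch of that pattern (the struct and method names are copied from the diff; the `main` wrapper is illustrative only):

```v
import os

struct Options {
	verbose bool
}

fn (opt Options) verbose_eprintln(msg string) {
	// only log when verbose mode is enabled
	if opt.verbose {
		eprintln(msg)
	}
}

fn (opt Options) verbose_execute(cmd string) os.Result {
	// log the command, then run it; the result is returned unchanged
	opt.verbose_eprintln('Running `$cmd`')
	return os.execute(cmd)
}

fn main() {
	opt := Options{
		verbose: true
	}
	res := opt.verbose_execute('echo hello')
	opt.verbose_eprintln('exit code: $res.exit_code')
}
```

Centralizing the check keeps every call site to a single line and makes the logging policy (stderr, verbose-gated) impossible to apply inconsistently.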
@@ -0,0 +1,292 @@
+// Copyright (c) 2020 Lars Pontoppidan. All rights reserved.
+// Use of this source code is governed by an MIT license
+// that can be found in the LICENSE file.
+import os
+import flag
+
+const (
+	tool_name = 'v missdoc'
+	tool_version = '0.1.0'
+	tool_description = 'Prints all V functions in .v files under PATH/, that do not yet have documentation comments.'
+	work_dir_prefix = normalise_path(os.real_path(os.wd_at_startup) + os.path_separator)
+)
+
+struct UndocumentedFN {
+	file string
+	line int
+	signature string
+	tags []string
+}
+
+struct Options {
+	show_help bool
+	collect_tags bool
+	deprecated bool
+	private bool
+	js bool
+	no_line_numbers bool
+	exclude []string
+	relative_paths bool
+mut:
+	verify bool
+	diff bool
+	additional_args []string
+}
+
+fn (opt Options) collect_undocumented_functions_in_dir(directory string) []UndocumentedFN {
+	mut files := []string{}
+	collect(directory, mut files, fn (npath string, mut accumulated_paths []string) {
+		if !npath.ends_with('.v') {
+			return
+		}
+		if npath.ends_with('_test.v') {
+			return
+		}
+		accumulated_paths << npath
+	})
+	mut undocumented_fns := []UndocumentedFN{}
+	for file in files {
+		if !opt.js && file.ends_with('.js.v') {
+			continue
+		}
+		if opt.exclude.len > 0 && opt.exclude.any(file.contains(it)) {
+			continue
+		}
+		undocumented_fns << opt.collect_undocumented_functions_in_file(file)
+	}
+	return undocumented_fns
+}
+
+fn (opt &Options) collect_undocumented_functions_in_file(nfile string) []UndocumentedFN {
+	file := os.real_path(nfile)
+	contents := os.read_file(file) or { panic(err) }
+	lines := contents.split('\n')
+	mut list := []UndocumentedFN{}
+	mut comments := []string{}
+	mut tags := []string{}
+	for i, line in lines {
+		if line.starts_with('//') {
+			comments << line
+		} else if line.trim_space().starts_with('[') {
+			tags << collect_tags(line)
+		} else if line.starts_with('pub fn')
+			|| (opt.private && (line.starts_with('fn ') && !(line.starts_with('fn C.')
+			|| line.starts_with('fn main')))) {
+			if comments.len == 0 {
+				clean_line := line.all_before_last(' {')
+				list << UndocumentedFN{
+					line: i + 1
+					signature: clean_line
+					tags: tags
+					file: file
+				}
+			}
+			tags = []
+			comments = []
+		} else {
+			tags = []
+			comments = []
+		}
+	}
+	return list
+}
+
+fn (opt &Options) collect_undocumented_functions_in_path(path string) []UndocumentedFN {
+	mut undocumented_functions := []UndocumentedFN{}
+	if os.is_file(path) {
+		undocumented_functions << opt.collect_undocumented_functions_in_file(path)
+	} else {
+		undocumented_functions << opt.collect_undocumented_functions_in_dir(path)
+	}
+	return undocumented_functions
+}
+
+fn (opt &Options) report_undocumented_functions_in_path(path string) int {
+	mut list := opt.collect_undocumented_functions_in_path(path)
+	opt.report_undocumented_functions(list)
+	return list.len
+}
+
+fn (opt &Options) report_undocumented_functions(list []UndocumentedFN) {
+	if list.len > 0 {
+		for undocumented_fn in list {
+			mut line_numbers := '$undocumented_fn.line:0:'
+			if opt.no_line_numbers {
+				line_numbers = ''
+			}
+			tags_str := if opt.collect_tags && undocumented_fn.tags.len > 0 {
+				'$undocumented_fn.tags'
+			} else {
+				''
+			}
+			file := undocumented_fn.file
+			ofile := if opt.relative_paths {
+				file.replace(work_dir_prefix, '')
+			} else {
+				os.real_path(file)
+			}
+			if opt.deprecated {
+				println('$ofile:$line_numbers$undocumented_fn.signature $tags_str')
+			} else {
+				mut has_deprecation_tag := false
+				for tag in undocumented_fn.tags {
+					if tag.starts_with('deprecated') {
+						has_deprecation_tag = true
+						break
+					}
+				}
+				if !has_deprecation_tag {
+					println('$ofile:$line_numbers$undocumented_fn.signature $tags_str')
+				}
+			}
+		}
+	}
+}
+
+fn (opt &Options) diff_undocumented_functions_in_paths(path_old string, path_new string) []UndocumentedFN {
+	old := os.real_path(path_old)
+	new := os.real_path(path_new)
+
+	mut old_undocumented_functions := opt.collect_undocumented_functions_in_path(old)
+	mut new_undocumented_functions := opt.collect_undocumented_functions_in_path(new)
+
+	mut differs := []UndocumentedFN{}
+	if new_undocumented_functions.len > old_undocumented_functions.len {
+		for new_undoc_fn in new_undocumented_functions {
+			new_relative_file := new_undoc_fn.file.replace(new, '').trim_string_left(os.path_separator)
+			mut found := false
+			for old_undoc_fn in old_undocumented_functions {
+				old_relative_file := old_undoc_fn.file.replace(old, '').trim_string_left(os.path_separator)
+				if new_relative_file == old_relative_file
+					&& new_undoc_fn.signature == old_undoc_fn.signature {
+					found = true
+					break
+				}
+			}
+			if !found {
+				differs << new_undoc_fn
+			}
+		}
+	}
+	differs.sort_with_compare(sort_undoc_fns)
+	return differs
+}
+
+fn sort_undoc_fns(a &UndocumentedFN, b &UndocumentedFN) int {
+	if a.file < b.file {
+		return -1
+	}
+	if a.file > b.file {
+		return 1
+	}
+	// same file sort by signature
+	else {
+		if a.signature < b.signature {
+			return -1
+		}
+		if a.signature > b.signature {
+			return 1
+		}
+		return 0
+	}
+}
+
+fn normalise_path(path string) string {
+	return path.replace('\\', '/')
+}
+
+fn collect(path string, mut l []string, f fn (string, mut []string)) {
+	if !os.is_dir(path) {
+		return
+	}
+	mut files := os.ls(path) or { return }
+	for file in files {
+		p := normalise_path(os.join_path_single(path, file))
+		if os.is_dir(p) && !os.is_link(p) {
+			collect(p, mut l, f)
+		} else if os.exists(p) {
+			f(p, mut l)
+		}
+	}
+	return
+}
+
+fn collect_tags(line string) []string {
+	mut cleaned := line.all_before('/')
+	cleaned = cleaned.replace_each(['[', '', ']', '', ' ', ''])
+	return cleaned.split(',')
+}
+
+fn main() {
+	mut fp := flag.new_flag_parser(os.args[1..]) // skip the "v" command.
+	fp.application(tool_name)
+	fp.version(tool_version)
+	fp.description(tool_description)
+	fp.arguments_description('PATH [PATH]...')
+	fp.skip_executable() // skip the "missdoc" command.
+
+	// Collect tool options
+	mut opt := Options{
+		show_help: fp.bool('help', `h`, false, 'Show this help text.')
+		deprecated: fp.bool('deprecated', `d`, false, 'Include deprecated functions in output.')
+		private: fp.bool('private', `p`, false, 'Include private functions in output.')
+		js: fp.bool('js', 0, false, 'Include JavaScript functions in output.')
+		no_line_numbers: fp.bool('no-line-numbers', `n`, false, 'Exclude line numbers in output.')
+		collect_tags: fp.bool('tags', `t`, false, 'Also print function tags if any is found.')
+		exclude: fp.string_multi('exclude', `e`, '')
+		relative_paths: fp.bool('relative-paths', `r`, false, 'Use relative paths in output.')
+		diff: fp.bool('diff', 0, false, 'exit(1) and show difference between two PATH inputs, return 0 otherwise.')
+		verify: fp.bool('verify', 0, false, 'exit(1) if documentation is missing, 0 otherwise.')
+	}
+
+	opt.additional_args = fp.finalize() or { panic(err) }
+
+	if opt.show_help {
+		println(fp.usage())
+		exit(0)
+	}
+	if opt.additional_args.len == 0 {
+		println(fp.usage())
+		eprintln('Error: $tool_name is missing PATH input')
+		exit(1)
+	}
+	// Allow short-long versions to prevent false positive situations, should
+	// the user miss a `-`. E.g.: the `-verify` flag would be ignored and missdoc
+	// will return 0 for success plus a list of any undocumented functions.
+	if '-verify' in opt.additional_args {
+		opt.verify = true
+	}
+	if '-diff' in opt.additional_args {
+		opt.diff = true
+	}
+	if opt.diff {
+		if opt.additional_args.len < 2 {
+			println(fp.usage())
+			eprintln('Error: $tool_name --diff needs two valid PATH inputs')
+			exit(1)
+		}
+		path_old := opt.additional_args[0]
+		path_new := opt.additional_args[1]
+		if !(os.is_file(path_old) || os.is_dir(path_old)) || !(os.is_file(path_new)
+			|| os.is_dir(path_new)) {
+			println(fp.usage())
+			eprintln('Error: $tool_name --diff needs two valid PATH inputs')
+			exit(1)
+		}
+		list := opt.diff_undocumented_functions_in_paths(path_old, path_new)
+		if list.len > 0 {
+			opt.report_undocumented_functions(list)
+			exit(1)
+		}
+		exit(0)
+	}
+	mut total := 0
+	for path in opt.additional_args {
+		if os.is_file(path) || os.is_dir(path) {
+			total += opt.report_undocumented_functions_in_path(path)
+		}
+	}
+	if opt.verify && total > 0 {
+		exit(1)
+	}
+}
cmd/tools/vpm.v | 159 changed lines
@@ -4,6 +4,7 @@
 module main
 
 import os
+import rand
 import os.cmdline
 import net.http
 import net.urllib
@@ -12,7 +13,8 @@ import vhelp
 import v.vmod
 
 const (
-	default_vpm_server_urls = ['https://vpm.vlang.io']
+	default_vpm_server_urls = ['https://vpm.vlang.io', 'https://vpm.url4e.com']
+	vpm_server_urls = rand.shuffle_clone(default_vpm_server_urls) or { [] } // ensure that all queries are distributed fairly
 	valid_vpm_commands = ['help', 'search', 'install', 'update', 'upgrade', 'outdated',
 		'list', 'remove', 'show']
 	excluded_dirs = ['cache', 'vlib']
@@ -87,6 +89,12 @@ fn main() {
 		module_names = manifest.dependencies
 	}
+	mut source := Source.vpm
+	if '--once' in options {
+		module_names = vpm_once_filter(module_names)
+		if module_names.len == 0 {
+			return
+		}
+	}
+	if '--git' in options {
+		source = Source.git
+	}
@@ -202,24 +210,24 @@ fn vpm_install_from_vpm(module_names []string) {
 			println('VPM needs `$vcs` to be installed.')
 			continue
 		}
-		mod_name_as_path := mod.name.replace('.', os.path_separator).replace('-', '_').to_lower()
-		final_module_path := os.real_path(os.join_path(settings.vmodules_path, mod_name_as_path))
-		if os.exists(final_module_path) {
+		//
+		minfo := mod_name_info(mod.name)
+		if os.exists(minfo.final_module_path) {
 			vpm_update([name])
 			continue
 		}
-		println('Installing module "$name" from "$mod.url" to "$final_module_path" ...')
+		println('Installing module "$name" from "$mod.url" to "$minfo.final_module_path" ...')
 		vcs_install_cmd := supported_vcs_install_cmds[vcs]
-		cmd := '$vcs_install_cmd "$mod.url" "$final_module_path"'
+		cmd := '$vcs_install_cmd "$mod.url" "$minfo.final_module_path"'
 		verbose_println(' command: $cmd')
 		cmdres := os.execute(cmd)
 		if cmdres.exit_code != 0 {
 			errors++
-			println('Failed installing module "$name" to "$final_module_path" .')
+			println('Failed installing module "$name" to "$minfo.final_module_path" .')
 			print_failed_cmd(cmd, cmdres)
 			continue
 		}
-		resolve_dependencies(name, final_module_path, module_names)
+		resolve_dependencies(name, minfo.final_module_path, module_names)
 	}
 	if errors > 0 {
 		exit(1)
@@ -264,7 +272,7 @@ fn vpm_install_from_vcs(module_names []string, vcs_key string) {
 	}
 
 	repo_name := url.substr(second_cut_pos + 1, first_cut_pos)
-	mut name := repo_name + os.path_separator + mod_name
+	mut name := os.join_path(repo_name, mod_name)
 	mod_name_as_path := name.replace('-', '_').to_lower()
 	mut final_module_path := os.real_path(os.join_path(settings.vmodules_path, mod_name_as_path))
 	if os.exists(final_module_path) {
@@ -291,20 +299,19 @@ fn vpm_install_from_vcs(module_names []string, vcs_key string) {
 	if os.exists(vmod_path) {
 		data := os.read_file(vmod_path) or { return }
 		vmod := parse_vmod(data)
-		mod_path := os.real_path(os.join_path(settings.vmodules_path, vmod.name.replace('.',
-			os.path_separator)))
-		println('Relocating module from "$name" to "$vmod.name" ( "$mod_path" ) ...')
-		if os.exists(mod_path) {
-			println('Warning module "$mod_path" already exsits!')
-			println('Removing module "$mod_path" ...')
-			os.rmdir_all(mod_path) or {
+		minfo := mod_name_info(vmod.name)
+		println('Relocating module from "$name" to "$vmod.name" ( "$minfo.final_module_path" ) ...')
+		if os.exists(minfo.final_module_path) {
+			println('Warning module "$minfo.final_module_path" already exsits!')
+			println('Removing module "$minfo.final_module_path" ...')
+			os.rmdir_all(minfo.final_module_path) or {
 				errors++
-				println('Errors while removing "$mod_path" :')
+				println('Errors while removing "$minfo.final_module_path" :')
 				println(err)
 				continue
 			}
 		}
-		os.mv(final_module_path, mod_path) or {
+		os.mv(final_module_path, minfo.final_module_path) or {
 			errors++
 			println('Errors while relocating module "$name" :')
 			println(err)
@@ -317,7 +324,7 @@ fn vpm_install_from_vcs(module_names []string, vcs_key string) {
 			continue
 		}
 		println('Module "$name" relocated to "$vmod.name" successfully.')
-		final_module_path = mod_path
+		final_module_path = minfo.final_module_path
 		name = vmod.name
 	}
 	resolve_dependencies(name, final_module_path, module_names)
@@ -327,6 +334,17 @@ fn vpm_install_from_vcs(module_names []string, vcs_key string) {
 	}
 }
 
+fn vpm_once_filter(module_names []string) []string {
+	installed_modules := get_installed_modules()
+	mut toinstall := []string{}
+	for mn in module_names {
+		if mn !in installed_modules {
+			toinstall << mn
+		}
+	}
+	return toinstall
+}
+
 fn vpm_install(module_names []string, source Source) {
 	if settings.is_help {
 		vhelp.show_topic('install')
@@ -336,15 +354,16 @@ fn vpm_install(module_names []string, source Source) {
 		println('´v install´ requires *at least one* module name.')
 		exit(2)
 	}
 
-	if source == .vpm {
-		vpm_install_from_vpm(module_names)
-	}
-	if source == .git {
-		vpm_install_from_vcs(module_names, 'git')
-	}
-	if source == .hg {
-		vpm_install_from_vcs(module_names, 'hg')
+	match source {
+		.vpm {
+			vpm_install_from_vpm(module_names)
+		}
+		.git {
+			vpm_install_from_vcs(module_names, 'git')
+		}
+		.hg {
+			vpm_install_from_vcs(module_names, 'hg')
+		}
 	}
 }
 
@@ -359,10 +378,7 @@ fn vpm_update(m []string) {
 	}
 	mut errors := 0
 	for modulename in module_names {
-		mut zname := modulename
-		if mod := get_mod_by_url(modulename) {
-			zname = mod.name
-		}
+		zname := url_to_module_name(modulename)
 		final_module_path := valid_final_path_of_existing_module(modulename) or { continue }
 		os.chdir(final_module_path) or {}
 		println('Updating module "$zname" in "$final_module_path" ...')
@@ -445,12 +461,11 @@ fn vpm_outdated() {
 fn vpm_list() {
 	module_names := get_installed_modules()
 	if module_names.len == 0 {
-		println('You have no modules installed.')
+		eprintln('You have no modules installed.')
 		exit(0)
 	}
-	println('Installed modules:')
 	for mod in module_names {
-		println(' $mod')
+		println(mod)
 	}
 }
 
@@ -468,7 +483,7 @@ fn vpm_remove(module_names []string) {
 		println('Removing module "$name" ...')
 		verbose_println('removing folder $final_module_path')
 		os.rmdir_all(final_module_path) or {
-			verbose_println('error while removing "$final_module_path": $err.msg')
+			verbose_println('error while removing "$final_module_path": $err.msg()')
|
||||
}
|
||||
// delete author directory if it is empty
|
||||
author := name.split('.')[0]
|
||||
|
@ -479,33 +494,28 @@ fn vpm_remove(module_names []string) {
|
|||
if os.is_dir_empty(author_dir) {
|
||||
verbose_println('removing author folder $author_dir')
|
||||
os.rmdir(author_dir) or {
|
||||
verbose_println('error while removing "$author_dir": $err.msg')
|
||||
verbose_println('error while removing "$author_dir": $err.msg()')
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
fn valid_final_path_of_existing_module(modulename string) ?string {
|
||||
mut name := modulename
|
||||
if mod := get_mod_by_url(name) {
|
||||
name = mod.name
|
||||
}
|
||||
mod_name_as_path := name.replace('.', os.path_separator).replace('-', '_').to_lower()
|
||||
name_of_vmodules_folder := os.join_path(settings.vmodules_path, mod_name_as_path)
|
||||
final_module_path := os.real_path(name_of_vmodules_folder)
|
||||
if !os.exists(final_module_path) {
|
||||
println('No module with name "$name" exists at $name_of_vmodules_folder')
|
||||
name := if mod := get_mod_by_url(modulename) { mod.name } else { modulename }
|
||||
minfo := mod_name_info(name)
|
||||
if !os.exists(minfo.final_module_path) {
|
||||
println('No module with name "$minfo.mname_normalised" exists at $minfo.final_module_path')
|
||||
return none
|
||||
}
|
||||
if !os.is_dir(final_module_path) {
|
||||
println('Skipping "$name_of_vmodules_folder", since it is not a folder.')
|
||||
if !os.is_dir(minfo.final_module_path) {
|
||||
println('Skipping "$minfo.final_module_path", since it is not a folder.')
|
||||
return none
|
||||
}
|
||||
vcs_used_in_dir(final_module_path) or {
|
||||
println('Skipping "$name_of_vmodules_folder", since it does not use a supported vcs.')
|
||||
vcs_used_in_dir(minfo.final_module_path) or {
|
||||
println('Skipping "$minfo.final_module_path", since it does not use a supported vcs.')
|
||||
return none
|
||||
}
|
||||
return final_module_path
|
||||
return minfo.final_module_path
|
||||
}
|
||||
|
||||
fn ensure_vmodules_dir_exist() {
|
||||
|
@ -556,6 +566,31 @@ fn get_installed_modules() []string {
|
|||
return modules
|
||||
}
|
||||
|
||||
struct ModNameInfo {
|
||||
mut:
|
||||
mname string // The-user.The-mod , *never* The-user.The-mod.git
|
||||
mname_normalised string // the_user.the_mod
|
||||
mname_as_path string // the_user/the_mod
|
||||
final_module_path string // ~/.vmodules/the_user/the_mod
|
||||
}
|
||||
|
||||
fn mod_name_info(mod_name string) ModNameInfo {
|
||||
mut info := ModNameInfo{}
|
||||
info.mname = if mod_name.ends_with('.git') { mod_name.replace('.git', '') } else { mod_name }
|
||||
info.mname_normalised = info.mname.replace('-', '_').to_lower()
|
||||
info.mname_as_path = info.mname_normalised.replace('.', os.path_separator)
|
||||
info.final_module_path = os.real_path(os.join_path(settings.vmodules_path, info.mname_as_path))
|
||||
return info
|
||||
}
|
||||
|
||||
fn url_to_module_name(modulename string) string {
|
||||
mut res := if mod := get_mod_by_url(modulename) { mod.name } else { modulename }
|
||||
if res.ends_with('.git') {
|
||||
res = res.replace('.git', '')
|
||||
}
|
||||
return res
|
||||
}
|
||||
|
||||
fn get_all_modules() []string {
|
||||
url := get_working_server_url()
|
||||
r := http.get(url) or { panic(err) }
|
||||
|
@ -563,21 +598,26 @@ fn get_all_modules() []string {
|
|||
println('Failed to search vpm.vlang.io. Status code: $r.status_code')
|
||||
exit(1)
|
||||
}
|
||||
s := r.text
|
||||
s := r.body
|
||||
mut read_len := 0
|
||||
mut modules := []string{}
|
||||
for read_len < s.len {
|
||||
mut start_token := '<a href="/mod'
|
||||
mut start_token := "<a href='/mod"
|
||||
end_token := '</a>'
|
||||
// get the start index of the module entry
|
||||
mut start_index := s.index_after(start_token, read_len)
|
||||
if start_index == -1 {
|
||||
break
|
||||
start_token = '<a href="/mod'
|
||||
start_index = s.index_after(start_token, read_len)
|
||||
if start_index == -1 {
|
||||
break
|
||||
}
|
||||
}
|
||||
// get the index of the end of anchor (a) opening tag
|
||||
// we use the previous start_index to make sure we are getting a module and not just a random 'a' tag
|
||||
start_token = '">'
|
||||
start_token = '>'
|
||||
start_index = s.index_after(start_token, start_index) + start_token.len
|
||||
|
||||
// get the index of the end of module entry
|
||||
end_index := s.index_after(end_token, start_index)
|
||||
if end_index == -1 {
|
||||
|
@ -626,7 +666,7 @@ fn get_working_server_url() string {
|
|||
server_urls := if settings.server_urls.len > 0 {
|
||||
settings.server_urls
|
||||
} else {
|
||||
default_vpm_server_urls
|
||||
vpm_server_urls
|
||||
}
|
||||
for url in server_urls {
|
||||
verbose_println('Trying server url: $url')
|
||||
|
@ -687,7 +727,8 @@ fn get_module_meta_info(name string) ?Mod {
|
|||
return mod
|
||||
}
|
||||
mut errors := []string{}
|
||||
for server_url in default_vpm_server_urls {
|
||||
|
||||
for server_url in vpm_server_urls {
|
||||
modurl := server_url + '/jsmod/$name'
|
||||
verbose_println('Retrieving module metadata from: "$modurl" ...')
|
||||
r := http.get(modurl) or {
|
||||
|
@ -695,7 +736,7 @@ fn get_module_meta_info(name string) ?Mod {
|
|||
errors << 'Error details: $err'
|
||||
continue
|
||||
}
|
||||
if r.status_code == 404 || r.text.trim_space() == '404' {
|
||||
if r.status_code == 404 || r.body.trim_space() == '404' {
|
||||
errors << 'Skipping module "$name", since "$server_url" reported that "$name" does not exist.'
|
||||
continue
|
||||
}
|
||||
|
@ -703,7 +744,7 @@ fn get_module_meta_info(name string) ?Mod {
|
|||
errors << 'Skipping module "$name", since "$server_url" responded with $r.status_code http status code. Please try again later.'
|
||||
continue
|
||||
}
|
||||
s := r.text
|
||||
s := r.body
|
||||
if s.len > 0 && s[0] != `{` {
|
||||
errors << 'Invalid json data'
|
||||
errors << s.trim_space().limit(100) + ' ...'
|
||||
|
|
|
@@ -259,7 +259,7 @@ fn print_welcome_screen() {
}
}

fn run_repl(workdir string, vrepl_prefix string) {
fn run_repl(workdir string, vrepl_prefix string) int {
if !is_stdin_a_pipe {
print_welcome_screen()
}

@@ -297,6 +297,15 @@ fn run_repl(workdir string, vrepl_prefix string) {
if line.len <= -1 || line == '' || line == 'exit' {
break
}
if exit_pos := line.index('exit') {
oparen := line[(exit_pos + 4)..].trim_space()
if oparen.starts_with('(') {
if closing := oparen.index(')') {
rc := oparen[1..closing].parse_int(0, 8) or { panic(err) }
return int(rc)
}
}
}
r.line = line
if r.line == '\n' {
continue

@@ -353,7 +362,7 @@ fn run_repl(workdir string, vrepl_prefix string) {
if r.line.starts_with('print') {
source_code := r.current_source_code(false, false) + '\n$r.line\n'
os.write_file(file, source_code) or { panic(err) }
s := repl_run_vfile(file) or { return }
s := repl_run_vfile(file) or { return 1 }
print_output(s)
} else {
mut temp_line := r.line

@@ -378,13 +387,13 @@ fn run_repl(workdir string, vrepl_prefix string) {
'#include ',
'for ',
'or ',
'insert',
'delete',
'prepend',
'sort',
'clear',
'trim',
'as',
'insert(',
'delete(',
'prepend(',
'sort(',
'clear(',
'trim(',
' as ',
]
mut is_statement := false
if filter_line.count('=') % 2 == 1 {

@@ -423,7 +432,7 @@ fn run_repl(workdir string, vrepl_prefix string) {
temp_source_code = r.current_source_code(true, false) + '\n$temp_line\n'
}
os.write_file(temp_file, temp_source_code) or { panic(err) }
s := repl_run_vfile(temp_file) or { return }
s := repl_run_vfile(temp_file) or { return 1 }
if !func_call && s.exit_code == 0 && !temp_flag {
for r.temp_lines.len > 0 {
if !r.temp_lines[0].starts_with('print') {

@@ -446,6 +455,7 @@ fn run_repl(workdir string, vrepl_prefix string) {
print_output(s)
}
}
return 0
}

fn convert_output(os_result os.Result) string {

@@ -493,7 +503,7 @@ fn main() {
if !is_stdin_a_pipe {
os.setenv('VCOLORS', 'always', true)
}
run_repl(replfolder, replprefix)
exit(run_repl(replfolder, replprefix))
}

fn rerror(s string) {
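The REPL change above makes `run_repl` return an exit code, and teaches it to recognize `exit(N)` lines: it finds `exit`, expects a `(`, and parses the digits up to the closing `)` as the process exit code. A sketch of that parsing in Go rather than V (error handling is simplified here to return `false`, where the V code panics on a bad number):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// replExitCode sketches the new exit(N) handling in the REPL loop:
// locate "exit", require a following "(", and parse the text up to ")"
// as the exit code. Returns (code, true) when the pattern matches.
func replExitCode(line string) (int, bool) {
	exitPos := strings.Index(line, "exit")
	if exitPos < 0 {
		return 0, false
	}
	rest := strings.TrimSpace(line[exitPos+4:])
	if !strings.HasPrefix(rest, "(") {
		return 0, false
	}
	closing := strings.Index(rest, ")")
	if closing < 0 {
		return 0, false
	}
	// base 0 auto-detects 0x/0o/0b prefixes, mirroring V's parse_int(0, 8)
	rc, err := strconv.ParseInt(rest[1:closing], 0, 8)
	if err != nil {
		return 0, false
	}
	return int(rc), true
}

func main() {
	code, ok := replExitCode("exit(3)")
	fmt.Println(code, ok)
}
```

The returned code then flows out through `main`, which now calls `exit(run_repl(...))`.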
@@ -12,11 +12,11 @@ fn main() {
fp.version('0.0.1')
fp.description('\nScan .v source files, and print the V tokens contained in them.')
fp.arguments_description('PATH [PATH]...')
fp.limit_free_args_to_at_least(1) ?
fp.limit_free_args_to_at_least(1)?
pref := pref.new_preferences()
mut all_paths := fp.remaining_parameters()
for path in all_paths {
mut scanner := scanner.new_scanner_file(path, .parse_comments, pref) ?
mut scanner := scanner.new_scanner_file(path, .parse_comments, pref)?
mut tok := token.Token{}
for tok.kind != .eof {
tok = scanner.scan()
@@ -15,7 +15,7 @@ fn main() {
short_v_name := vexe_name.all_before('.')
//
recompilation.must_be_enabled(vroot, 'Please install V from source, to use `$vexe_name self` .')
os.chdir(vroot) ?
os.chdir(vroot)?
os.setenv('VCOLORS', 'always', true)
args := os.args[1..].filter(it != 'self')
jargs := args.join(' ')

@@ -30,7 +30,7 @@ fn main() {
// The user just wants an independent copy of v, and so we are done.
return
}
backup_old_version_and_rename_newer(short_v_name) or { panic(err.msg) }
backup_old_version_and_rename_newer(short_v_name) or { panic(err.msg()) }
println('V built successfully as executable "$vexe_name".')
}

@@ -71,17 +71,17 @@ fn backup_old_version_and_rename_newer(short_v_name string) ?bool {

list_folder(short_v_name, 'before:', 'removing $bak_file ...')
if os.exists(bak_file) {
os.rm(bak_file) or { errors << 'failed removing $bak_file: $err.msg' }
os.rm(bak_file) or { errors << 'failed removing $bak_file: $err.msg()' }
}

list_folder(short_v_name, '', 'moving $v_file to $bak_file ...')
os.mv(v_file, bak_file) or { errors << err.msg }
os.mv(v_file, bak_file) or { errors << err.msg() }

list_folder(short_v_name, '', 'removing $v_file ...')
os.rm(v_file) or {}

list_folder(short_v_name, '', 'moving $v2_file to $v_file ...')
os.mv_by_cp(v2_file, v_file) or { panic(err.msg) }
os.mv_by_cp(v2_file, v_file) or { panic(err.msg()) }

list_folder(short_v_name, 'after:', '')
@@ -11,7 +11,7 @@ fn main() {
$if windows {
println('Setup freetype...')
vroot := os.dir(pref.vexe_path())
os.chdir(vroot) ?
os.chdir(vroot)?
if os.is_dir(freetype_folder) {
println('Thirdparty "freetype" is already installed.')
} else {
@@ -172,7 +172,7 @@ fn compile_shaders(opt Options, input_path string) ? {
// Currently sokol-shdc allows for multiple --input flags
// - but it's only the last entry that's actually compiled/used
// Given this fact - we can only compile one '.glsl' file to one C '.h' header
compile_shader(co, shader_file) ?
compile_shader(co, shader_file)?
}
}

@@ -233,10 +233,10 @@ fn collect(path string, mut list []string) {
// tools can be setup or is already in place.
fn ensure_external_tools(opt Options) ? {
if !os.exists(cache_dir) {
os.mkdir_all(cache_dir) ?
os.mkdir_all(cache_dir)?
}
if opt.force_update {
download_shdc(opt) ?
download_shdc(opt)?
return
}

@@ -250,7 +250,7 @@ fn ensure_external_tools(opt Options) ? {
return
}

download_shdc(opt) ?
download_shdc(opt)?
}

// shdc_exe returns an absolute path to the `sokol-shdc` tool.

@@ -277,26 +277,26 @@ fn download_shdc(opt Options) ? {
}
}
if os.exists(file) {
os.rm(file) ?
os.rm(file)?
}

mut dtmp_file, dtmp_path := util.temp_file(util.TempFileOptions{ path: os.dir(file) }) ?
mut dtmp_file, dtmp_path := util.temp_file(util.TempFileOptions{ path: os.dir(file) })?
dtmp_file.close()
if opt.verbose {
eprintln('$tool_name downloading sokol-shdc from $download_url')
}
http.download_file(download_url, dtmp_path) or {
os.rm(dtmp_path) ?
os.rm(dtmp_path)?
return error('$tool_name failed to download sokol-shdc needed for shader compiling: $err')
}
// Make it executable
os.chmod(dtmp_path, 0o775) ?
os.chmod(dtmp_path, 0o775)?
// Move downloaded file in place
os.mv(dtmp_path, file) ?
os.mv(dtmp_path, file)?
if runtime_os in ['linux', 'macos'] {
// Use the .exe file ending to minimize platform friction.
os.mv(file, shdc) ?
os.mv(file, shdc)?
}
// Update internal version file
os.write_file(shdc_version_file, update_to_shdc_version) ?
os.write_file(shdc_version_file, update_to_shdc_version)?
}
@@ -0,0 +1,45 @@
import os

fn main() {
mut files := []string{}
args := os.args#[2..]
for a in args {
if os.is_file(a) {
files << a
continue
}
if os.is_dir(a) {
files << os.walk_ext(a, '.v')
continue
}
}
files.sort()
if files.len == 0 {
println('0 .v files found.\n')
println('Usage:')
println('   v should-compile-all examples/ some/deep/file.v another/')
println('... will try to compile all .v files found in the given folders and files, one by one.')
println('If every single one of them compiles, the command will exit with an error code of 0.')
println('If *any* of them *fail* to compile, the command will exit with an error code of 1.')
println('')
println('Note: this command is intended to be used in CI pipelines for v modules, like this:')
println('   cd module/ ; v should-compile-all examples/ \n')
exit(1)
}
mut failed_commands := []string{}
for idx, example in files {
cmd := '${os.quoted_path(@VEXE)} ${os.quoted_path(example)}'
println('> compiling ${idx + 1:4}/${files.len:-4}: $cmd')
if 0 != os.system(cmd) {
failed_commands << cmd
}
}
if failed_commands.len > 0 {
for idx, fcmd in failed_commands {
eprintln('>>> FAILED command ${idx + 1:4}/${failed_commands.len:-4}: $fcmd')
}
println('Summary: ${failed_commands.len:4}/${files.len:-4} file(s) failed to compile.')
exit(1)
}
println('Summary: all $files.len file(s) compiled successfully.')
}
@@ -106,7 +106,7 @@ fn setup_symlink_windows(vexe string) {
println('Symlink $vsymlink to $vexe created.')
println('Checking system %PATH%...')
reg_sys_env_handle := get_reg_sys_env_handle() or {
warn_and_exit(err.msg)
warn_and_exit(err.msg())
return
}
// TODO: Fix defers inside ifs

@@ -133,7 +133,7 @@ fn setup_symlink_windows(vexe string) {
println('Adding symlink directory to system %PATH%...')
set_reg_value(reg_sys_env_handle, 'Path', new_sys_env_path) or {
C.RegCloseKey(reg_sys_env_handle)
warn_and_exit(err.msg)
warn_and_exit(err.msg())
}
println('Done.')
}
@@ -26,6 +26,7 @@ fn main() {
spent := sw.elapsed().milliseconds()
oks := commands.filter(it.ecode == 0)
fails := commands.filter(it.ecode != 0)
flush_stdout()
println('')
println(term.header_left(term_highlight('Summary of `v test-all`:'), '-'))
println(term_highlight('Total runtime: $spent ms'))

@@ -37,6 +38,7 @@ fn main() {
msg := if fcmd.errmsg != '' { fcmd.errmsg } else { fcmd.line }
println(term.failed('> Failed:') + ' $msg')
}
flush_stdout()
if fails.len > 0 {
exit(1)
}

@@ -49,17 +51,26 @@ enum RunCommandKind {

const expect_nothing = '<nothing>'

const starts_with_nothing = '<nothing>'

const ends_with_nothing = '<nothing>'

const contains_nothing = '<nothing>'

struct Command {
mut:
line string
label string // when set, the label will be printed *before* cmd.line is executed
ecode int
okmsg string
errmsg string
rmfile string
runcmd RunCommandKind = .system
expect string = expect_nothing
output string
line string
label string // when set, the label will be printed *before* cmd.line is executed
ecode int
okmsg string
errmsg string
rmfile string
runcmd RunCommandKind = .system
expect string = expect_nothing
starts_with string = starts_with_nothing
ends_with string = ends_with_nothing
contains string = contains_nothing
output string
}

fn get_all_commands() []Command {

@@ -81,12 +92,32 @@ fn get_all_commands() []Command {
runcmd: .execute
expect: 'Hello, World!\n'
}
if os.getenv('V_CI_MUSL').len == 0 {
for compiler_name in ['clang', 'gcc'] {
if _ := os.find_abs_path_of_executable(compiler_name) {
res << Command{
line: '$vexe -cc $compiler_name -gc boehm run examples/hello_world.v'
okmsg: '`v -cc $compiler_name -gc boehm run examples/hello_world.v` works'
runcmd: .execute
expect: 'Hello, World!\n'
}
}
}
}
res << Command{
line: '$vexe interpret examples/hello_world.v'
okmsg: 'V can interpret hello world.'
runcmd: .execute
expect: 'Hello, World!\n'
}
res << Command{
line: '$vexe interpret examples/hanoi.v'
okmsg: 'V can interpret hanoi.v'
runcmd: .execute
starts_with: 'Disc 1 from A to C...\n'
ends_with: 'Disc 1 from A to C...\n'
contains: 'Disc 7 from A to C...\n'
}
res << Command{
line: '$vexe -o - examples/hello_world.v | grep "#define V_COMMIT_HASH" > /dev/null'
okmsg: 'V prints the generated source code to stdout with `-o -` .'

@@ -210,9 +241,8 @@ fn get_all_commands() []Command {
rmfile: 'examples/tetris/tetris'
}
$if macos || linux {
ipath := '$vroot/thirdparty/stdatomic/nix'
res << Command{
line: '$vexe -o v.c cmd/v && cc -Werror -I ${os.quoted_path(ipath)} v.c -lpthread -lm && rm -rf a.out'
line: '$vexe -o v.c cmd/v && cc -Werror v.c -lpthread -lm && rm -rf a.out'
label: 'v.c should be buildable with no warnings...'
okmsg: 'v.c can be compiled without warnings. This is good :)'
rmfile: 'v.c'

@@ -241,23 +271,56 @@ fn (mut cmd Command) run() {
spent := sw.elapsed().milliseconds()
//
mut is_failed := false
mut is_failed_expected := false
mut is_failed_starts_with := false
mut is_failed_ends_with := false
mut is_failed_contains := false
if cmd.ecode != 0 {
is_failed = true
}
if cmd.expect != expect_nothing {
if cmd.output != cmd.expect {
is_failed = true
is_failed_expected = true
}
}
if cmd.starts_with != starts_with_nothing {
if !cmd.output.starts_with(cmd.starts_with) {
is_failed = true
is_failed_starts_with = true
}
}
if cmd.ends_with != ends_with_nothing {
if !cmd.output.ends_with(cmd.ends_with) {
is_failed = true
is_failed_ends_with = true
}
}
if cmd.contains != contains_nothing {
if !cmd.output.contains(cmd.contains) {
is_failed = true
is_failed_contains = true
}
}
//
run_label := if is_failed { term.failed('FAILED') } else { term_highlight('OK') }
println('> Running: "$cmd.line" took: $spent ms ... $run_label')
//
if is_failed && cmd.expect != expect_nothing {
if cmd.output != cmd.expect {
eprintln('> expected:\n$cmd.expect')
eprintln('> output:\n$cmd.output')
}
if is_failed && is_failed_expected {
eprintln('> expected:\n$cmd.expect')
eprintln('> output:\n$cmd.output')
}
if is_failed && is_failed_starts_with {
eprintln('> expected to start with:\n$cmd.starts_with')
eprintln('> output:\n${cmd.output#[..cmd.starts_with.len]}')
}
if is_failed && is_failed_ends_with {
eprintln('> expected to end with:\n$cmd.ends_with')
eprintln('> output:\n${cmd.output#[-cmd.starts_with.len..]}')
}
if is_failed && is_failed_contains {
eprintln('> expected to contain:\n$cmd.contains')
eprintln('> output:\n$cmd.output')
}
if vtest_nocleanup {
return
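The `Command.run` change above generalizes the old single `expect` comparison into four independent checks (`expect`, `starts_with`, `ends_with`, `contains`), each disabled by default via a `<nothing>` sentinel. A compact sketch of that check logic in Go rather than V, returning the names of the violated checks instead of setting per-check flags:

```go
package main

import (
	"fmt"
	"strings"
)

// The "<nothing>" sentinel marks a check as unset, as in the diff above.
const nothing = "<nothing>"

// failedChecks sketches the expect/starts_with/ends_with/contains checks
// from Command.run(): each configured check that the output violates
// is added to the returned list.
func failedChecks(output, expect, startsWith, endsWith, contains string) []string {
	var fails []string
	if expect != nothing && output != expect {
		fails = append(fails, "expect")
	}
	if startsWith != nothing && !strings.HasPrefix(output, startsWith) {
		fails = append(fails, "starts_with")
	}
	if endsWith != nothing && !strings.HasSuffix(output, endsWith) {
		fails = append(fails, "ends_with")
	}
	if contains != nothing && !strings.Contains(output, contains) {
		fails = append(fails, "contains")
	}
	return fails
}

func main() {
	// Mirrors the hanoi.v command: check the first line, last line, and a middle line.
	out := "Disc 1 from A to C...\nDisc 7 from A to C...\nDisc 1 from A to C...\n"
	fmt.Println(failedChecks(out, nothing, "Disc 1", "from A to C...\n", "Disc 7"))
}
```

The sentinel approach lets every check default to "off" without needing option types in the struct fields.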
@@ -5,52 +5,67 @@ import testing
import v.util
import arrays

const (
vet_known_failing_exceptions = []string{}
vet_folders = [
'vlib/sqlite',
'vlib/v',
'vlib/x/json2',
'vlib/x/ttf',
'cmd/v',
'cmd/tools',
'examples/2048',
'examples/tetris',
'examples/term.ui',
]
verify_known_failing_exceptions = [
// Handcrafted meaningful formatting of code parts (mostly arrays)
'examples/sokol/02_cubes_glsl/cube_glsl.v',
'examples/sokol/03_march_tracing_glsl/rt_glsl.v',
'examples/sokol/04_multi_shader_glsl/rt_glsl.v',
'examples/sokol/05_instancing_glsl/rt_glsl.v',
'examples/sokol/06_obj_viewer/show_obj.v',
'vlib/v/checker/tests/modules/deprecated_module/main.v' /* adds deprecated_module. module prefix to imports, even though the folder has v.mod */,
'vlib/gg/m4/graphic.v',
'vlib/gg/m4/m4_test.v',
'vlib/gg/m4/matrix.v',
'vlib/builtin/int_test.v' /* special number formatting that should be tested */,
// TODOs and unfixed vfmt bugs
'vlib/v/gen/js/tests/js.v', /* local `hello` fn, gets replaced with module `hello` aliased as `hl` */
]
vfmt_verify_list = [
'cmd/',
'examples/',
'tutorials/',
'vlib/',
]
vfmt_known_failing_exceptions = arrays.merge(verify_known_failing_exceptions, [
'vlib/regex/regex_test.v' /* contains meaningfull formatting of the test case data */,
'vlib/crypto/sha512/sha512block_generic.v' /* formatting of large constant arrays wraps to too many lines */,
'vlib/crypto/aes/const.v' /* formatting of large constant arrays wraps to too many lines */,
])
)
const vet_known_failing = [
'do_not_delete_this',
]

const (
vexe = os.getenv('VEXE')
vroot = os.dir(vexe)
is_fix = '-fix' in os.args
)
const vet_known_failing_windows = [
'do_not_delete_this',
'vlib/v/gen/js/tests/testdata/byte_is_space.v',
'vlib/v/gen/js/tests/testdata/compare_ints.v',
'vlib/v/gen/js/tests/testdata/hw.v',
'vlib/v/gen/js/tests/testdata/string_methods.v',
'vlib/v/tests/project_with_modules_having_submodules/bin/main.vsh',
'vlib/v/tests/valgrind/simple_interpolation_script_mode.v',
'vlib/v/tests/valgrind/simple_interpolation_script_mode_more_scopes.v',
]

const vet_folders = [
'vlib/sqlite',
'vlib/v',
'vlib/x/json2',
'vlib/x/ttf',
'cmd/v',
'cmd/tools',
'examples/2048',
'examples/tetris',
'examples/term.ui',
]

const verify_known_failing_exceptions = [
// Handcrafted meaningful formatting of code parts (mostly arrays)
'examples/sokol/02_cubes_glsl/cube_glsl.v',
'examples/sokol/03_march_tracing_glsl/rt_glsl.v',
'examples/sokol/04_multi_shader_glsl/rt_glsl.v',
'examples/sokol/05_instancing_glsl/rt_glsl.v',
'examples/sokol/06_obj_viewer/show_obj.v',
'vlib/v/checker/tests/modules/deprecated_module/main.v' /* adds deprecated_module. module prefix to imports, even though the folder has v.mod */,
'vlib/gg/m4/graphic.v',
'vlib/gg/m4/m4_test.v',
'vlib/gg/m4/matrix.v',
'vlib/builtin/int_test.v' /* special number formatting that should be tested */,
// TODOs and unfixed vfmt bugs
'vlib/v/gen/js/tests/js.v', /* local `hello` fn, gets replaced with module `hello` aliased as `hl` */
]

const vfmt_verify_list = [
'cmd/',
'examples/',
'tutorials/',
'vlib/',
]

const vfmt_known_failing_exceptions = arrays.merge(verify_known_failing_exceptions, [
'vlib/regex/regex_test.v' /* contains meaningfull formatting of the test case data */,
'vlib/crypto/sha512/sha512block_generic.v' /* formatting of large constant arrays wraps to too many lines */,
'vlib/crypto/aes/const.v' /* formatting of large constant arrays wraps to too many lines */,
])

const vexe = os.getenv('VEXE')

const vroot = os.dir(vexe)

const is_fix = '-fix' in os.args

fn main() {
args_string := os.args[1..].join(' ')

@@ -76,8 +91,12 @@ fn tsession(vargs string, tool_source string, tool_cmd string, tool_args string,

fn v_test_vetting(vargs string) {
expanded_vet_list := util.find_all_v_files(vet_folders) or { return }
mut vet_known_exceptions := vet_known_failing.clone()
if os.user_os() == 'windows' {
vet_known_exceptions << vet_known_failing_windows
}
vet_session := tsession(vargs, 'vvet', '${os.quoted_path(vexe)} vet', 'vet', expanded_vet_list,
vet_known_failing_exceptions)
vet_known_exceptions)
//
fmt_cmd, fmt_args := if is_fix {
'${os.quoted_path(vexe)} fmt -w', 'fmt -w'
@@ -54,7 +54,7 @@ fn main() {
context.pref = &pref.Preferences{
output_mode: .silent
}
mut source := os.read_file(context.path) ?
mut source := os.read_file(context.path)?
source = source[..context.cut_index]

go fn (ms int) {

@@ -121,7 +121,7 @@ fn process_cli_args() &Context {
exit(0)
}
context.all_paths = fp.finalize() or {
context.error(err.msg)
context.error(err.msg())
exit(1)
}
if !context.is_worker && context.all_paths.len == 0 {
@ -6,8 +6,86 @@ import v.pref
|
|||
|
||||
const github_job = os.getenv('GITHUB_JOB')
|
||||
|
||||
const just_essential = os.getenv('VTEST_JUST_ESSENTIAL') != ''
|
||||
|
||||
const (
|
||||
essential_list = [
|
||||
'cmd/tools/vvet/vet_test.v',
|
||||
'vlib/arrays/arrays_test.v',
|
||||
'vlib/bitfield/bitfield_test.v',
|
||||
//
|
||||
'vlib/builtin/int_test.v',
|
||||
'vlib/builtin/array_test.v',
|
||||
'vlib/builtin/float_test.v',
|
||||
'vlib/builtin/byte_test.v',
|
||||
'vlib/builtin/rune_test.v',
|
||||
'vlib/builtin/builtin_test.v',
|
||||
'vlib/builtin/map_of_floats_test.v',
|
||||
'vlib/builtin/string_int_test.v',
|
||||
'vlib/builtin/utf8_test.v',
|
||||
'vlib/builtin/map_test.v',
|
||||
'vlib/builtin/string_test.v',
|
||||
'vlib/builtin/sorting_test.v',
|
||||
'vlib/builtin/gated_array_string_test.v',
|
||||
'vlib/builtin/array_shrinkage_test.v',
|
||||
'vlib/builtin/isnil_test.v',
|
||||
'vlib/builtin/string_match_glob_test.v',
|
||||
'vlib/builtin/string_strip_margin_test.v',
|
||||
//
'vlib/cli/command_test.v',
'vlib/crypto/md5/md5_test.v',
'vlib/dl/dl_test.v',
'vlib/encoding/base64/base64_test.v',
'vlib/encoding/utf8/encoding_utf8_test.v',
'vlib/encoding/utf8/utf8_util_test.v',
'vlib/flag/flag_test.v',
'vlib/json/json_decode_test.v',
'vlib/math/math_test.v',
'vlib/net/tcp_test.v',
'vlib/net/http/http_test.v',
'vlib/net/http/server_test.v',
'vlib/net/http/request_test.v',
'vlib/io/io_test.v',
'vlib/io/os_file_reader_test.v',
'vlib/os/process_test.v',
'vlib/os/file_test.v',
'vlib/os/notify/notify_test.v',
'vlib/os/filepath_test.v',
'vlib/os/environment_test.v',
'vlib/os/glob_test.v',
'vlib/os/os_test.v',
'vlib/rand/random_numbers_test.v',
'vlib/rand/wyrand/wyrand_test.v',
'vlib/runtime/runtime_test.v',
'vlib/semver/semver_test.v',
'vlib/sync/stdatomic/atomic_test.v',
'vlib/sync/thread_test.v',
'vlib/sync/waitgroup_test.v',
'vlib/sync/pool/pool_test.v',
'vlib/strings/builder_test.v',
'vlib/strconv/atof_test.v',
'vlib/strconv/atoi_test.v',
'vlib/strconv/f32_f64_to_string_test.v',
'vlib/strconv/format_test.v',
'vlib/strconv/number_to_base_test.v',
'vlib/time/time_test.v',
'vlib/toml/tests/toml_test.v',
'vlib/v/compiler_errors_test.v',
'vlib/v/doc/doc_test.v',
'vlib/v/eval/interpret_test.v',
'vlib/v/fmt/fmt_keep_test.v',
'vlib/v/fmt/fmt_test.v',
'vlib/v/gen/c/coutput_test.v',
'vlib/v/gen/js/program_test.v',
'vlib/v/gen/native/macho_test.v',
'vlib/v/gen/native/tests/native_test.v',
'vlib/v/pkgconfig/pkgconfig_test.v',
'vlib/v/tests/inout/compiler_test.v',
'vlib/x/json2/json2_test.v',
]
skip_test_files = [
'cmd/tools/vdoc/html_tag_escape_test.v', /* can't locate local module: markdown */
'cmd/tools/vdoc/tests/vdoc_file_test.v', /* fails on Windows; order of output is not as expected */
'vlib/context/onecontext/onecontext_test.v',
'vlib/context/deadline_test.v' /* sometimes blocks */,
'vlib/mysql/mysql_orm_test.v' /* mysql not installed */,
@@ -45,6 +123,7 @@ const (
'vlib/sqlite/sqlite_orm_test.v',
'vlib/v/tests/orm_sub_struct_test.v',
'vlib/v/tests/orm_sub_array_struct_test.v',
'vlib/v/tests/orm_joined_tables_select_test.v',
'vlib/v/tests/sql_statement_inside_fn_call_test.v',
'vlib/vweb/tests/vweb_test.v',
'vlib/vweb/request_test.v',
@@ -63,6 +142,7 @@ const (
]
skip_with_werror = [
'do_not_remove',
'vlib/v/embed_file/tests/embed_file_test.v',
]
skip_with_asan_compiler = [
'do_not_remove',
@@ -85,6 +165,7 @@ const (
'vlib/orm/orm_test.v',
'vlib/v/tests/orm_sub_struct_test.v',
'vlib/v/tests/orm_sub_array_struct_test.v',
'vlib/v/tests/orm_joined_tables_select_test.v',
'vlib/v/tests/sql_statement_inside_fn_call_test.v',
'vlib/clipboard/clipboard_test.v',
'vlib/vweb/tests/vweb_test.v',
@@ -105,6 +186,10 @@ const (
skip_on_non_linux = [
'do_not_remove',
]
skip_on_windows_msvc = [
'do_not_remove',
'vlib/v/tests/const_fixed_array_containing_references_to_itself_test.v', // error C2099: initializer is not a constant
]
skip_on_windows = [
'vlib/context/cancel_test.v',
'vlib/context/deadline_test.v',
@@ -112,8 +197,7 @@ const (
'vlib/context/value_test.v',
'vlib/orm/orm_test.v',
'vlib/v/tests/orm_sub_struct_test.v',
'vlib/v/tests/closure_test.v',
'vlib/v/tests/closure_generator_test.v',
'vlib/v/tests/orm_joined_tables_select_test.v',
'vlib/net/websocket/ws_test.v',
'vlib/net/unix/unix_test.v',
'vlib/net/unix/use_net_and_net_unix_together_test.v',
@@ -139,7 +223,6 @@ const (
'do_not_remove',
]
skip_on_arm64 = [
'vlib/v/tests/closure_generator_test.v',
'do_not_remove',
]
skip_on_non_amd64_or_arm64 = [
@@ -167,8 +250,14 @@ fn main() {
cmd_prefix := args_string.all_before('test-self')
title := 'testing vlib'
mut all_test_files := os.walk_ext(os.join_path(vroot, 'vlib'), '_test.v')
all_test_files << os.walk_ext(os.join_path(vroot, 'cmd'), '_test.v')
test_js_files := os.walk_ext(os.join_path(vroot, 'vlib'), '_test.js.v')
all_test_files << test_js_files

if just_essential {
rooted_essential_list := essential_list.map(os.join_path(vroot, it))
all_test_files = all_test_files.filter(rooted_essential_list.contains(it))
}
testing.eheader(title)
mut tsession := testing.new_test_session(cmd_prefix, true)
tsession.files << all_test_files.filter(!it.contains('testdata' + os.path_separator))
@@ -261,6 +350,9 @@ fn main() {
}
$if windows {
tsession.skip_files << skip_on_windows
$if msvc {
tsession.skip_files << skip_on_windows_msvc
}
}
$if !windows {
tsession.skip_files << skip_on_non_windows

@@ -69,7 +69,7 @@ fn main() {
testing.header('Testing...')
ts.test()
println(ts.benchmark.total_message('all V _test.v files'))
if ts.failed {
if ts.failed_cmds.len > 0 {
exit(1)
}
}

@@ -6,7 +6,7 @@ import v.pref
fn main() {
vexe := pref.vexe_path()
vroot := os.dir(vexe)
os.chdir(vroot) ?
os.chdir(vroot)?
os.setenv('VCOLORS', 'always', true)
self_idx := os.args.index('tracev')
args := os.args[1..self_idx]

@@ -26,7 +26,7 @@ fn new_app() App {
fn main() {
app := new_app()
recompilation.must_be_enabled(app.vroot, 'Please install V from source, to use `v up` .')
os.chdir(app.vroot) ?
os.chdir(app.vroot)?
println('Updating V...')
app.update_from_master()
v_hash := version.githash(false)
@@ -34,13 +34,19 @@ fn main() {
// println(v_hash)
// println(current_hash)
if v_hash == current_hash {
println('V is already updated.')
app.show_current_v_version()
return
}
$if windows {
app.backup('cmd/tools/vup.exe')
}
app.recompile_v()
if !app.recompile_v() {
app.show_current_v_version()
eprintln('Recompiling V *failed*.')
eprintln('Try running `${get_make_cmd_name()}` .')
exit(1)
}
app.recompile_vup()
app.show_current_v_version()
}
@@ -66,7 +72,7 @@ fn (app App) update_from_master() {
}
}

fn (app App) recompile_v() {
fn (app App) recompile_v() bool {
// Note: app.vexe is more reliable than just v (which may be a symlink)
opts := if app.is_prod { '-prod' } else { '' }
vself := '${os.quoted_path(app.vexe)} $opts self'
@@ -74,35 +80,35 @@ fn (app App) recompile_v() {
self_result := os.execute(vself)
if self_result.exit_code == 0 {
println(self_result.output.trim_space())
return
return true
} else {
app.vprintln('`$vself` failed, running `make`...')
app.vprintln(self_result.output.trim_space())
}
app.make(vself)
return app.make(vself)
}

fn (app App) recompile_vup() {
fn (app App) recompile_vup() bool {
vup_result := os.execute('${os.quoted_path(app.vexe)} -g cmd/tools/vup.v')
if vup_result.exit_code != 0 {
eprintln('recompiling vup.v failed:')
eprintln(vup_result.output)
return false
}
return true
}

fn (app App) make(vself string) {
mut make := 'make'
$if windows {
make = 'make.bat'
}
fn (app App) make(vself string) bool {
make := get_make_cmd_name()
make_result := os.execute(make)
if make_result.exit_code != 0 {
eprintln('> $make failed:')
eprintln('> make output:')
eprintln(make_result.output)
return
return false
}
app.vprintln(make_result.output)
return true
}

fn (app App) show_current_v_version() {
@@ -116,17 +122,16 @@ fn (app App) show_current_v_version() {
vversion += ', timestamp: ' + latest_v_commit_time.output.trim_space()
}
}
println('Current V version:')
println(vversion)
println('Current V version: $vversion')
}
}

fn (app App) backup(file string) {
backup_file := '${file}_old.exe'
if os.exists(backup_file) {
os.rm(backup_file) or { eprintln('failed removing $backup_file: $err.msg') }
os.rm(backup_file) or { eprintln('failed removing $backup_file: $err.msg()') }
}
os.mv(file, backup_file) or { eprintln('failed moving $file: $err.msg') }
os.mv(file, backup_file) or { eprintln('failed moving $file: $err.msg()') }
}

fn (app App) git_command(command string) {
@@ -162,3 +167,11 @@ fn (app App) get_git() {
eprintln("error: Install `git` using your system's package manager")
}
}

fn get_make_cmd_name() string {
$if windows {
return 'make.bat'
} $else {
return 'make'
}
}

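The `get_make_cmd_name` helper above uses a compile-time `$if windows { } $else { }` block, so the branch is resolved when V itself is built. The same platform dispatch can be done at runtime in other languages; a minimal illustrative sketch in Python (not part of the diff, `sys.platform` check is the rough analogue):

```python
import sys

def get_make_cmd_name() -> str:
    # Mirrors the V helper: `make.bat` on Windows, plain `make` elsewhere.
    # Python decides this at runtime; V's $if decides it at compile time.
    return 'make.bat' if sys.platform == 'win32' else 'make'

print(get_make_cmd_name())
```
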
@@ -0,0 +1,7 @@
fn abc() int {
return if true {
0x4000 // 16KB
} else {
0x1000 // 4KB
}
}

@@ -1,2 +1,2 @@
cmd/tools/vvet/tests/array_init_one_val.vv:2: error: Use `var == value` instead of `var in [value]`
NB: You can run `v fmt -w file.v` to fix these errors automatically
Note: You can run `v fmt -w file.v` to fix these errors automatically

@@ -3,4 +3,4 @@ cmd/tools/vvet/tests/indent_with_space.vv:10: error: Looks like you are using sp
cmd/tools/vvet/tests/indent_with_space.vv:17: error: Looks like you are using spaces for indentation.
cmd/tools/vvet/tests/indent_with_space.vv:20: error: Looks like you are using spaces for indentation.
cmd/tools/vvet/tests/indent_with_space.vv:22: error: Looks like you are using spaces for indentation.
NB: You can run `v fmt -w file.v` to fix these errors automatically
Note: You can run `v fmt -w file.v` to fix these errors automatically

@@ -0,0 +1,6 @@
// Some header comment

// read_response is a carefully constructed comment.
// read_response_body. <-- this would earlier trigger a false
// positive.
pub fn read_response() ?(string, string) {}

@@ -1,2 +1,2 @@
cmd/tools/vvet/tests/parens_space_a.vv:1: error: Looks like you are adding a space after `(`
NB: You can run `v fmt -w file.v` to fix these errors automatically
Note: You can run `v fmt -w file.v` to fix these errors automatically

@@ -1,2 +1,2 @@
cmd/tools/vvet/tests/parens_space_b.vv:1: error: Looks like you are adding a space before `)`
NB: You can run `v fmt -w file.v` to fix these errors automatically
Note: You can run `v fmt -w file.v` to fix these errors automatically

@@ -14,7 +14,7 @@ fn find_diff_cmd() string {
fn test_vet() ? {
vexe := os.getenv('VEXE')
vroot := os.dir(vexe)
os.chdir(vroot) ?
os.chdir(vroot)?
test_dir := 'cmd/tools/vvet/tests'
tests := get_tests_in_dir(test_dir)
fails := check_path(vexe, test_dir, tests)

@@ -20,11 +20,12 @@ mut:
}

struct Options {
is_force bool
is_werror bool
is_verbose bool
show_warnings bool
use_color bool
is_force bool
is_werror bool
is_verbose bool
show_warnings bool
use_color bool
doc_private_fns_too bool
}

const term_colors = term.can_show_color_on_stderr()
@@ -38,6 +39,7 @@ fn main() {
is_verbose: '-verbose' in vet_options || '-v' in vet_options
show_warnings: '-hide-warnings' !in vet_options && '-w' !in vet_options
use_color: '-color' in vet_options || (term_colors && '-nocolor' !in vet_options)
doc_private_fns_too: '-p' in vet_options
}
}
mut paths := cmdline.only_non_options(vet_options)
@@ -110,92 +112,104 @@ fn (mut vt Vet) vet_file(path string) {

// vet_line vets the contents of `line` from `vet.file`.
fn (mut vt Vet) vet_line(lines []string, line string, lnumber int) {
// Vet public functions
if line.starts_with('pub fn') || (line.starts_with('fn ') && !(line.starts_with('fn C.')
|| line.starts_with('fn main'))) {
// Scan function declarations for missing documentation
is_pub_fn := line.starts_with('pub fn')
if lnumber > 0 {
collect_tags := fn (line string) []string {
mut cleaned := line.all_before('/')
cleaned = cleaned.replace_each(['[', '', ']', '', ' ', ''])
return cleaned.split(',')
}
ident_fn_name := fn (line string) string {
mut fn_idx := line.index(' fn ') or { return '' }
if line.len < fn_idx + 5 {
return ''
}
mut tokens := line[fn_idx + 4..].split(' ')
// Skip struct identifier
if tokens.first().starts_with('(') {
fn_idx = line.index(')') or { return '' }
tokens = line[fn_idx..].split(' ')
if tokens.len > 1 {
tokens = [tokens[1]]
}
}
if tokens.len > 0 {
return tokens[0].all_before('(')
}
vt.vet_fn_documentation(lines, line, lnumber)
}

// vet_fn_documentation ensures that functions are documented
fn (mut vt Vet) vet_fn_documentation(lines []string, line string, lnumber int) {
if line.starts_with('fn C.') {
return
}
is_pub_fn := line.starts_with('pub fn ')
is_fn := is_pub_fn || line.starts_with('fn ')
if !is_fn {
return
}
if line.starts_with('fn main') {
return
}
if !(is_pub_fn || vt.opt.doc_private_fns_too) {
return
}
// Scan function declarations for missing documentation
if lnumber > 0 {
collect_tags := fn (line string) []string {
mut cleaned := line.all_before('/')
cleaned = cleaned.replace_each(['[', '', ']', '', ' ', ''])
return cleaned.split(',')
}
ident_fn_name := fn (line string) string {
mut fn_idx := line.index(' fn ') or { return '' }
if line.len < fn_idx + 5 {
return ''
}
mut line_above := lines[lnumber - 1]
mut tags := []string{}
if !line_above.starts_with('//') {
mut grab := true
for j := lnumber - 1; j >= 0; j-- {
prev_line := lines[j]
if prev_line.contains('}') { // We've looked back to the above scope, stop here
break
} else if prev_line.starts_with('[') {
tags << collect_tags(prev_line)
continue
} else if prev_line.starts_with('//') { // Single-line comment
grab = false
break
}
mut tokens := line[fn_idx + 4..].split(' ')
// Skip struct identifier
if tokens.first().starts_with('(') {
fn_idx = line.index(')') or { return '' }
tokens = line[fn_idx..].split(' ')
if tokens.len > 1 {
tokens = [tokens[1]]
}
if grab {
}
if tokens.len > 0 {
return tokens[0].all_before('(')
}
return ''
}
mut line_above := lines[lnumber - 1]
mut tags := []string{}
if !line_above.starts_with('//') {
mut grab := true
for j := lnumber - 1; j >= 0; j-- {
prev_line := lines[j]
if prev_line.contains('}') { // We've looked back to the above scope, stop here
break
} else if prev_line.starts_with('[') {
tags << collect_tags(prev_line)
continue
} else if prev_line.starts_with('//') { // Single-line comment
grab = false
break
}
}
if grab {
clean_line := line.all_before_last('{').trim(' ')
vt.warn('Function documentation seems to be missing for "$clean_line".',
lnumber, .doc)
}
} else {
fn_name := ident_fn_name(line)
mut grab := true
for j := lnumber - 1; j >= 0; j-- {
mut prev_prev_line := ''
if j - 1 >= 0 {
prev_prev_line = lines[j - 1]
}
prev_line := lines[j]
if prev_line.contains('}') { // We've looked back to the above scope, stop here
break
} else if prev_line.starts_with('// $fn_name ') {
grab = false
break
} else if prev_line.starts_with('// $fn_name') && !prev_prev_line.starts_with('//') {
grab = false
clean_line := line.all_before_last('{').trim(' ')
if is_pub_fn {
vt.warn('Function documentation seems to be missing for "$clean_line".',
lnumber, .doc)
}
}
} else {
fn_name := ident_fn_name(line)
mut grab := true
for j := lnumber - 1; j >= 0; j-- {
prev_line := lines[j]
if prev_line.contains('}') { // We've looked back to the above scope, stop here
break
} else if prev_line.starts_with('// $fn_name ') {
grab = false
break
} else if prev_line.starts_with('// $fn_name') {
grab = false
if is_pub_fn {
clean_line := line.all_before_last('{').trim(' ')
vt.warn('The documentation for "$clean_line" seems incomplete.',
lnumber, .doc)
}
break
} else if prev_line.starts_with('[') {
tags << collect_tags(prev_line)
continue
} else if prev_line.starts_with('//') { // Single-line comment
continue
}
}
if grab {
clean_line := line.all_before_last('{').trim(' ')
if is_pub_fn {
vt.warn('A function name is missing from the documentation of "$clean_line".',
lnumber, .doc)
}
vt.warn('The documentation for "$clean_line" seems incomplete.', lnumber,
.doc)
break
} else if prev_line.starts_with('[') {
tags << collect_tags(prev_line)
continue
} else if prev_line.starts_with('//') { // Single-line comment
continue
}
}
if grab {
clean_line := line.all_before_last('{').trim(' ')
vt.warn('A function name is missing from the documentation of "$clean_line".',
lnumber, .doc)
}
}
}
}
@@ -89,6 +89,7 @@ mut:
v_cycles int // how many times the worker has restarted the V compiler
scan_cycles int // how many times the worker has scanned for source file changes
clear_terminal bool // whether to clear the terminal before each re-run
keep_running bool // when true, re-run the program automatically if it exits on its own. Useful for gg apps.
silent bool // when true, watch will not print a timestamp line before each re-run
add_files []string // path to additional files that have to be watched for changes
ignore_exts []string // extensions of files that will be ignored, even if they change (useful for sqlite.db files for example)
@@ -207,7 +208,7 @@ fn change_detection_loop(ocontext &Context) {
}

fn (mut context Context) kill_pgroup() {
if context.child_process == 0 {
if unsafe { context.child_process == 0 } {
return
}
if context.child_process.is_alive() {
@@ -260,6 +261,9 @@ fn (mut context Context) compilation_runner_loop() {
if notalive_count == 1 {
// a short lived process finished, do cleanup:
context.run_after_cmd()
if context.keep_running {
break
}
}
}
select {
@@ -282,6 +286,7 @@ fn (mut context Context) compilation_runner_loop() {
}
}
if !context.child_process.is_alive() {
context.elog('> child_process is no longer alive | notalive_count: $notalive_count')
context.child_process.wait()
context.child_process.close()
if notalive_count == 0 {
@@ -313,10 +318,11 @@ fn main() {
fp.description('Collect all .v files needed for a compilation, then re-run the compilation when any of the source changes.')
fp.arguments_description('[--silent] [--clear] [--ignore .db] [--add /path/to/a/file.v] [run] program.v')
fp.allow_unknown_args()
fp.limit_free_args_to_at_least(1) ?
fp.limit_free_args_to_at_least(1)?
context.is_worker = fp.bool('vwatchworker', 0, false, 'Internal flag. Used to distinguish vwatch manager and worker processes.')
context.silent = fp.bool('silent', `s`, false, 'Be more silent; do not print the watch timestamp before each re-run.')
context.clear_terminal = fp.bool('clear', `c`, false, 'Clears the terminal before each re-run.')
context.keep_running = fp.bool('keep', `k`, false, 'Keep the program running. Restart it automatically, if it exits by itself. Useful for gg/ui apps.')
context.add_files = fp.string('add', `a`, '', 'Add more files to be watched. Useful with `v watch -add=/tmp/feature.v run cmd/v /tmp/feature.v`, if you change *both* the compiler, and the feature.v file.').split(',')
context.ignore_exts = fp.string('ignore', `i`, '', 'Ignore files having these extensions. Useful with `v watch -ignore=.db run server.v`, if your server writes to an sqlite.db file in the same folder.').split(',')
show_help := fp.bool('help', `h`, false, 'Show this help screen.')

@@ -25,7 +25,20 @@ see also `v help build`.

-cstrict
Turn on additional C warnings. This slows down compilation
slightly (~10% for gcc), but sometimes provides better diagnosis.
slightly (~10% for gcc), but sometimes provides better error diagnosis.

-cmain <MainFunctionName>
Useful with framework-like code that uses macros to re-define `main`, like SDL2 does for example.
With that option, V will always generate:
`int MainFunctionName(int ___argc, char** ___argv) {` , for the program entry point function, *no matter* the OS.
Without it, on non-Windows systems, it will generate:
`int main(int ___argc, char** ___argv) {`
... and on Windows, it will generate:
a) `int WINAPI wWinMain(HINSTANCE instance, HINSTANCE prev_instance, LPWSTR cmd_line, int show_cmd){`
when you are compiling applications that `import gg`.
... or it will generate:
b) `int wmain(int ___argc, wchar_t* ___argv[], wchar_t* ___envp[]){`
when you are compiling console apps.

-showcc
Prints the C command that is used to build the program.
@@ -239,7 +252,15 @@ see also `v help build`.

-dump-c-flags file.txt
Write all C flags into `file.txt`, one flag per line.
If `file.txt` is `-`, then write the flags to stdout, one flag per line.
If `file.txt` is `-`, write to stdout instead.

-dump-modules file.txt
Write all module names used by the program in `file.txt`, one module per line.
If `file.txt` is `-`, write to stdout instead.

-dump-files file.txt
Write all V file paths used by the program in `file.txt`, one file per line.
If `file.txt` is `-`, write to stdout instead.

-no-rsp
By default, V passes all C compiler options to the backend C compiler
@@ -257,7 +278,7 @@ see also `v help build`.
Passing -no-std will remove that flag, and you can then use -cflags ''
to pass the other options for your specific C compiler.

-assert aborts
-assert aborts
Call abort() after an assertion failure. Debuggers usually
install signal handlers for SIGABRT, so your program will stop and you
will get a backtrace. If you are running your program outside of a
@@ -267,3 +288,14 @@ see also `v help build`.
Call print_backtrace() after an assertion failure. Note that
backtraces are not implemented yet on all combinations of
platform/compiler.

-thread-stack-size 4194304
Set the thread stack size to 4MB. Use multiples of 4096.
The default is 8MB, which is enough for compiling V programs, with deeply
nested expressions (~40 levels).
It may need to be increased, if you are getting stack overflow errors for
deeply recursive programs like some of the stages of the V compiler itself,
that use relatively few threads.
It may be decreased, to reduce the memory footprint of programs that launch
hundreds/thousands of threads, but where each of the threads does not need
a big stack.
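The trade-off the new `-thread-stack-size` flag documents (deep recursion needs a bigger per-thread stack; many threads want a smaller one) exists in most runtimes. An illustrative Python sketch of the same idea (this is `threading.stack_size` from Python's standard library, not V's mechanism):

```python
import threading

def deep(n):
    # Recurse a few hundred frames; each frame consumes stack space.
    return 0 if n == 0 else deep(n - 1)

# Request a 4 MB stack (a multiple of 4096, matching the flag's example value)
# for all threads created after this call.
threading.stack_size(4 * 1024 * 1024)

result = []
t = threading.Thread(target=lambda: result.append(deep(500)))
t.start()
t.join()
print(result[0])  # → 0
```

Increasing the size helps deeply recursive workers avoid stack overflows; decreasing it shrinks the memory footprint when spawning hundreds of threads.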
@@ -14,4 +14,4 @@ For more general build help, see also `v help build`.
-os <os>, -target-os <os>
Change the target OS that V compiles for.

The supported targets for the native backend are: `macos`, `linux`
The supported targets for the native backend are: `macos`, `linux` and `windows`
Some files were not shown because too many files have changed in this diff.