in my opinion, the biggest advantage of this refactor is readability, but yes - the frontend part has become really straightforward, peaceful even :) it just transforms the obtained arrays of ship marker data into geojson points that go onto the map. not storing this stuff anywhere in js
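for anyone curious, the transform is roughly this shape (a TS sketch - the marker field names here are my own placeholders, not the project's real ones):

```typescript
// Hypothetical shape of a ship marker as received from the backend
interface ShipMarker {
  mmsi: string;
  lat: number;
  lon: number;
  shipType?: number;
}

// Turn an array of markers into a GeoJSON FeatureCollection of Points,
// ready to hand to a map layer; nothing is kept around in JS afterwards.
function toGeoJson(markers: ShipMarker[]) {
  return {
    type: "FeatureCollection" as const,
    features: markers.map(m => ({
      type: "Feature" as const,
      geometry: {
        type: "Point" as const,
        // note: GeoJSON coordinate order is [longitude, latitude]
        coordinates: [m.lon, m.lat],
      },
      properties: { mmsi: m.mmsi, shipType: m.shipType ?? null },
    })),
  };
}
```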
Posts by Asta Areti
thank you for the motivation :&lt;
prior content of the hash table with ships' coordinates. these were objects with latitude and longitude properties, encoded into bytes (with messagepack...) and stored under the MMSI (=real-world id) of the respective ship. the hash table with type ids was more readable, because those are simple values, but it still lived separately...
content of the single hash table with all ships' data now. organizing such compound keys under one table, instead of making a separate hash table per ship (which is the more commonly advised approach with redis), was chosen so that aaaall the data for checkpoints can be conveniently retrieved with one simple HGETALL command
main part of the hosted service that accepts AIS data from the gRPC connection. both coordinates and ship type data are saved in the same redis hash table under compound keys, like in the previous pic
main part of the hosted service that sends ships' checkpoints to the frontend. all the entries are retrieved in one command, as mentioned in the post. then checkpoint instances are constructed from the entries (which are absolutely not guaranteed to come in order). the "mmsiShipMap" is a concurrent dictionary where keys are MMSIs as strings (because they are cut out of hash entry names, which are strings) and values are the checkpoint instances. property values are applied to them while finding hash entries with the same mmsi in the big collection received... and the results are sent via signalR in one action. prior to this, an almost identical merge by mmsi lived on the CLIENT side, ow ow ow, and caused "out of memory" errors from time to time 🥴
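the merge itself is simple; here's the grouping logic as a TypeScript sketch (field names like Lat/Lon/ShipType are placeholders, not my real hash fields):

```typescript
// One redis hash entry: compound key "<mmsi>:<field>" plus its value,
// mirroring the single-hash layout from the screenshots.
type HashEntry = { name: string; value: string };

interface Checkpoint {
  mmsi: string;
  lat?: number;
  lon?: number;
  shipType?: number;
}

// Group entries by the MMSI cut out of the compound key, regardless of
// the order they arrive in, filling checkpoint properties as we go.
function buildCheckpoints(entries: HashEntry[]): Map<string, Checkpoint> {
  const map = new Map<string, Checkpoint>();
  for (const e of entries) {
    const sep = e.name.indexOf(":");
    if (sep < 0) continue; // not a compound key, skip
    const mmsi = e.name.slice(0, sep);
    const field = e.name.slice(sep + 1);
    let cp = map.get(mmsi);
    if (!cp) { cp = { mmsi }; map.set(mmsi, cp); }
    if (field === "Lat") cp.lat = Number(e.value);
    else if (field === "Lon") cp.lon = Number(e.value);
    else if (field === "ShipType") cp.shipType = Number(e.value);
  }
  return map;
}
```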
i'm back to the maritime project! refactored redis storage of arriving #AIS!
❌ Before: 2 hash tables for ship types and coords, the latter stored as bytes, joining data on the frontend
✅ Now: 1 hash table with clean fields, all marker props in 1 class
#buildinpublic #dev #redis #dotnet #aspnet #gis
have some file path shenanigans from windows which I bumped into when working with files in java
hadn't thought about getting a path referring to the current folder by entering a dot .
and in the last pic, I guess it was trying to find a folder name in the input because of the /
#java #dev #filesystem #til
initial problematic situation: I joined an edge loop into a face and got this weird warp. initially I suspected that the problem was with normals
...so I flipped the normal of this face. I must say it helped, but only partially - the weird warp decreased but still didn't go away
what resolved the problem: in Layout mode, go to Geometry data (afaik you can just click the green symbol near the mesh name) and Clean Custom Normal Data. I guess this helped because the model was imported from fbx...
problem fixed, no warp, no custom normal data either :D
found a random blender tip when patching a model (from time to time I try 3d :))
a face looked warped despite very simple geometry. I tried flipping normals (helped, but only partly), mark/clean sharp, more bevel, toggling autosmooth..
right answer was to Clean Custom Normal Data
#blender #3d #b3d #til
y'all won't believe who suddenly has to learn Lua
one of the most random and unexpected things professionally nowadays, but turned out I need some for the maritime project
(try to guess in the comments why :&gt;)
#ScreenshotSaturday #lua #dev
upd to the current api project I added a frontend of... an asp net mvc app! (first time making this kind)
discovered http handlers - essential for adding the jwt token to requests, and sometimes calling auth refresh
also had to duplicate the auth in the frontend as a cookie
#buildinpublic #backend #dotnet #webdev #jwt
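the handler idea, sketched in TypeScript just to show the shape (the real frontend is asp net mvc, and getToken/refreshToken here are placeholders, not my actual API): attach the token to every request, and on a 401, refresh once and retry.

```typescript
// Placeholder token store - names are illustrative, not the project's real API.
type TokenStore = {
  getToken(): string | null;
  refreshToken(): Promise<string>;
};

// Wrap fetch: add the Bearer token, and if the server says 401
// (token expired), call refresh once and retry the original request.
async function authFetch(
  url: string,
  store: TokenStore,
  fetchFn: typeof fetch = fetch
): Promise<Response> {
  const doFetch = (token: string | null) =>
    fetchFn(url, {
      headers: token ? { Authorization: `Bearer ${token}` } : {},
    });

  let res = await doFetch(store.getToken());
  if (res.status === 401) {
    const fresh = await store.refreshToken(); // refresh once...
    res = await doFetch(fresh);               // ...then retry
  }
  return res;
}
```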
nuuu, why so sad? more like them choosing a better style for the new role - of maybe a quieter, but historic place of detailed problem-solving and a knowledgeable community
service-level method where a refresh token is replaced. in other implementations, where refresh tokens have arbitrary IDs, this subtask can be done with an update operation. but mine correspond to JWT token IDs, refresh is by definition called when a jwt is outdated, and ef core can't update key properties - so i gotta delete the entity with the old jwt and add a new entity where everything is new except for the corresponding user ID
if y'all wonder from the post above ☝️ how a 24 hr refresh token can correspond to a 15 min JWT:
- at /refresh, the current token is deleted from the db by the JTI from the header's JWT
- a new JWT and a new R.T. value are generated there
- then a R.T. entity is added with only the user ID left the same
#buildinpublic #auth #dotnet
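the rotation rule from the steps above, sketched in TypeScript over an in-memory map standing in for the table (the real project uses EF Core; names here are illustrative):

```typescript
// In-memory stand-in for the refresh-token table: JTI -> record.
interface RefreshRecord {
  jti: string;        // key: the ID of the JWT this token belongs to
  userId: string;     // the only property carried over on rotation
  value: string;
  expiresUtc: number; // ms since epoch, UTC
}

const DAY_MS = 24 * 60 * 60 * 1000; // refresh tokens last 1 day

// Since the key (JTI) can't be updated in place, rotation is:
// delete the record keyed by the old JTI, insert a fresh one
// where everything is new except the user ID.
function rotate(
  store: Map<string, RefreshRecord>,
  oldJti: string,
  newJti: string,
  newValue: string,
  nowUtc: number
): RefreshRecord | null {
  const old = store.get(oldJti);
  if (!old || old.expiresUtc < nowUtc) return null; // unknown or expired
  store.delete(oldJti);
  const fresh: RefreshRecord = {
    jti: newJti,
    userId: old.userId,
    value: newValue,
    expiresUtc: nowUtc + DAY_MS,
  };
  store.set(newJti, fresh);
  return fresh;
}
```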
upd I implemented refresh tokens for the same #API (for the first time ever)
they are stored in cookies AND in the DB, identified by the corresponding JTI (JWT token ID), and last for 1 day
there is a /refresh action + a refresh token is also given at login
#dotnet #dev #buildinpublic #backend #auth
example of a table breakdown in the pgAdmin 4 gui, demonstrating how NO indexes are shown on the primary keys 0_o (for the first table, the "indexes" tab is opened too)
checking the structure of the same tables from the previous pic with the psql tool (still inside pgAdmin 4). here the listed indexes include the ones on primary keys
sql server management studio's table breakdown gui for comparison, primary key indexes are shown
today i learned... or rather confirmed my suspicion that sql makes indexes for primary keys
but why doesn't pgAdmin 4 show them in the table dropdown? I was already thinking that stuff works differently across DBMSes
whew, no, there they are - PK indexes visible in psql
#sql #postgres #sqlserver #dev #database
I'm on windows 11 and docker Desktop with WSL2 backend
so thaaats when "SuccessRehashNeeded" is used, that's the first time I see it emerging in practice, interesting :0
aww thank you, it cheered me up
service method that creates a jwt token. no jwt/identity-specific types in the input or output (that's my viewmodel). the claims in the token include not only the obvious user properties but also Sub(ject), the user id - I didn't use it before - and Jti, the token id. the token's time properties, validity start and expiry, are saved in utc (earlier, inexperienced me used local time - don't do that)
program.cs part where jwt authentication is added. the key is generated dynamically!!)) from 256 crypto-random bytes (earlier I used 2 guids squished together to make 32 bytes, but that's not enough entropy). of course the default scheme is set to jwt - and don't forget to set the challenge scheme too! or else unauthorized requests will be redirected to a /login action, even if it doesn't exist, and weirdly end up as a 405 instead of a 401! next, i'm validating not only the key and lifetime, but the issuer and audience too. and I discovered such a thing as clock skew! extra lifetime for tokens, can you imagine! by default it's 5 minutes - so I was setting a 15 min lifetime on my tokens but they were actually valid for 20! so I zeroed it, of course
program.cs part where authorization for swagger is defined. basically an Authorize button appears in the swagger UI that opens a slot where you insert a token, and then this token is added to your requests. (that's completely new to me! made this thingy for the first time! earlier, when tokens came into view, I had to switch from swagger to postman to use an API :'&gt;) here I configured a basic pair of a security definition and a security requirement. as you can see, it's specified that the protocol used is HTTP, the auth scheme is Bearer, more specifically the Bearer format is JWT, and the token should sit in the request header. and apparently, to join this security definition + requirement pair, the definition name should be the same as the reference ID in the security scheme, in this case JwtBearerDefinition
example of how swagger auth works. I paste the token into this slot, and an Authorization header with the Bearer scheme and this token gets added to my requests. you can see it in the background, along with a successful response from an action that requires authorization
yesterday I added JWT authentication in a practice API
clearly improved at making JWT this time:
✅ token expiry in UTC
✅ generating the key from 256 random bytes
✅ validating audience, not just issuer
✅ using the Sub claim (=user ID)
✅ made an auth slot in swagger!
#dotnet #dev #jwt #backend #buildinpublic
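for illustration, the checklist above can be condensed into a from-scratch HS256 sketch (TypeScript with node:crypto, just to show the moving parts - the real project uses .NET's JWT types, and all names here are mine):

```typescript
import { randomBytes, createHmac, randomUUID } from "node:crypto";

const b64url = (buf: Buffer) => buf.toString("base64url");

// Key from cryptographically random bytes, not GUIDs glued together.
const key = randomBytes(32); // 256 bits

// Build an HS256 JWT by hand: header.payload.signature, all base64url.
function makeJwt(userId: string, lifetimeMinutes: number): string {
  const nowSec = Math.floor(Date.now() / 1000); // epoch seconds are UTC by definition
  const header = { alg: "HS256", typ: "JWT" };
  const payload = {
    sub: userId,       // Sub(ject) = user ID
    jti: randomUUID(), // token ID - what a refresh token can be keyed by
    iat: nowSec,
    exp: nowSec + lifetimeMinutes * 60, // expiry in UTC seconds
  };
  const head = b64url(Buffer.from(JSON.stringify(header)));
  const body = b64url(Buffer.from(JSON.stringify(payload)));
  const sig = b64url(
    createHmac("sha256", key).update(`${head}.${body}`).digest()
  );
  return `${head}.${body}.${sig}`;
}
```

note the clock skew gotcha from the pics applies at validation time, not here: a validator with the default 5 min skew will accept a 15 min token for ~20 minutes unless you zero the skew.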
chose tiktok mainly due to its powerful algorithm that can boost videos unlimitedly, not depending on prior social capital, the global preference for shortform content, and because I can rarely sit consistently from start to end of a coding session to comment on it and make a true-to-speed video for, say, YT
first I plan to showcase the steps of the main maritime project + the occasional side ones mentioned here. I recorded the development of the majority of features but was too busy to edit the videos. once I run out of those, I'm looking forward to crossposting - a post here asap, the tiktok 1-2 days later
freeens i have a #tiktok now, the genre is definitely #buildinpublic
will try to post timelapses of features being made there
www.tiktok.com/@asta.a43t1
already had the silliness to somehow edit a timelapse with inshot, pretty bad idea 😵‍💫
remember this small #api for currency exchange? let's take a break from the maritime project =) 💪
I implemented custom logging to a text file for when the api is launched NOT in docker. the file path is specified in appsettings.json btw
#dotnet #devlog #buildinpublic #devops #aspnet
these are EF Core Design and Tools - so far I need migrations in every project. yes I know, maybe a boring answer. earlier it also was Newtonsoft.Json, but nowadays I don't like its benchmarks, so I switched to System.Text.Json
upd in the maritime project I made the ship markers' size adapt to the zoom level: smaller at global zoom so it doesn't look like complete clutter, bigger at detailed zooms to see them better
+ some transparency at global zoom, for the density map effect ✨
#buildinpublic #maps #angular #webgl
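the sizing rule is basically an interpolation over zoom; a sketch with made-up numbers (the actual zoom range, radii and opacities are not my real values):

```typescript
// Linearly interpolate marker radius and opacity between a global zoom
// and a detailed zoom, clamping outside the range.
function markerStyle(zoom: number) {
  const minZoom = 3,  maxZoom = 14; // global ... detailed
  const minR = 1.5,   maxR = 6;     // px radius at each end
  const t = Math.min(1, Math.max(0, (zoom - minZoom) / (maxZoom - minZoom)));
  return {
    radius: minR + t * (maxR - minR),
    // semi-transparent when zoomed out -> the density-map effect
    opacity: 0.4 + t * 0.6,
  };
}
```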
aww thank you (β β‘β Β w β‘β ) stay here, this project has yet a long list of features to be added
frens I'm back and I implemented ship coloring according to their types 🥹
not everyone streams their type, hence there still are gray points, but for reference:
💚 cargo
❤️ tankers / rescue
💙 passenger / high speed
🔷 (lighter blue) - service
🧡 fishing
💛 leisure
#buildinpublic #maps #angular #webgl
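for the curious, the coloring is driven by the numeric AIS ship-and-cargo type; a hedged sketch (the ranges follow the common ITU-R M.1371 groupings, and the color strings are illustrative, not my exact palette):

```typescript
// Map an AIS ship-and-cargo type code to a legend color.
// Ranges: 70-79 cargo, 80-89 tanker, 51 SAR, 60-69 passenger,
// 40-49 high-speed craft, 50-59 special/service, 30 fishing,
// 36-37 sailing/pleasure craft.
function shipColor(type: number | undefined): string {
  if (type === undefined) return "#888";            // type not streamed -> gray
  if (type >= 70 && type <= 79) return "green";     // cargo
  if ((type >= 80 && type <= 89) || type === 51)
    return "red";                                   // tankers / rescue (SAR)
  if ((type >= 60 && type <= 69) || (type >= 40 && type <= 49))
    return "blue";                                  // passenger / high speed
  if (type >= 50 && type <= 59) return "lightblue"; // service
  if (type === 30) return "orange";                 // fishing
  if (type === 36 || type === 37) return "yellow";  // leisure
  return "#888";                                    // anything else -> gray
}
```

note the SAR check (51) has to come before the generic 50-59 service range, or rescue ships would get the service color.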
I finished the AIS data travel and now have ship positions on the map!)
there's a webGL renderer for the layer, so it shows the 18k-24k points that arrive every 2 sec easy peasy 🔥 though I know there are ~100k ships with ais, my source is pretty eurocentric...
#buildinpublic #angular #maps #geosky #webgl
example of ais data sent to browser via SignalR, in the picture these are ship infos, but I can (and will) send coordinates too
Angular proxy configuration file to make SignalR on the frontend connect to https deployed backend (instead of http by default)
including proxy configuration in the Angular project
upd I implemented signalR from backend to browser in the maritime project, completing the #ais data transfer from the public api to my frontend!
btw the traffic is sent every 1-2 sec in bulk; in between the sends it is stored in a #redis hash table
#buildinpublic #signalr #dotnet #angular #geosky
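for reference, such an Angular proxy config can look roughly like this (a sketch - the /hub path and port are placeholders for wherever the SignalR hub actually lives; "ws": true is the important bit so the websocket upgrade gets proxied too):

```json
{
  "/hub": {
    "target": "https://localhost:5001",
    "secure": false,
    "ws": true,
    "changeOrigin": true
  }
}
```

it gets wired in either via the "proxyConfig" option of the serve target in angular.json or with `ng serve --proxy-config proxy.conf.json`.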
successful gRPC communication: the AIS microservice on the right, sending out stuff, and the main backend on the left, accepting all of it (you can notice the same ship names in both logs)
protobuf classes for the gRPC: one main whole message with a body about one of: ship position, ship data, or a safety notice
solution structure so far: a shared class library project housing the .proto files and the classes generated from them, the main backend, and the ais-consuming microservice
implemented grpc communication in the maritime project: sending messages from ais microservice to main backend (in just as real time as receiving them)
don't get confused about the types, I really map both kinds of ship data messages to one result type
#grpc #dotnet #backend #buildinpublic #geosky
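the "one envelope, three possible bodies" shape maps naturally onto a protobuf oneof; a hedged sketch (names and field numbers are made up, not my real schema):

```proto
syntax = "proto3";

// Illustrative envelope: the body is exactly one of the three
// AIS message kinds mentioned in the post.
message AisMessage {
  string mmsi = 1;
  oneof body {
    PositionReport position = 2;
    ShipData ship_data = 3;
    SafetyNotice safety = 4;
  }
}

message PositionReport {
  double latitude = 1;
  double longitude = 2;
}

message ShipData {
  string name = 1;
  int32 ship_type = 2;
}

message SafetyNotice {
  string text = 1;
}
```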
refactored the AIS traffic processing into asynchronous producer-consumer pattern
I initially considered using a concurrent queue, but learnt about TPL channels, and they turned out more effective for a hundred messages per second 🚢🚢🚢
#buildinpublic #dotnet #csharp #ais #refactor
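the pattern itself, sketched in TypeScript purely for illustration (the project uses .NET's System.Threading.Channels; this is just the producer/consumer shape):

```typescript
// Minimal async producer/consumer queue: the producer writes without
// blocking, the consumer awaits the next item.
class AsyncQueue<T> {
  private items: T[] = [];
  private waiters: ((v: T) => void)[] = [];

  // producer side (e.g. the message-receiving callback)
  write(item: T): void {
    const waiter = this.waiters.shift();
    if (waiter) waiter(item); // hand directly to a waiting consumer
    else this.items.push(item);
  }

  // consumer side (e.g. the parsing loop)
  read(): Promise<T> {
    const item = this.items.shift();
    if (item !== undefined) return Promise.resolve(item);
    return new Promise(resolve => this.waiters.push(resolve));
  }
}
```

the decoupling is the point: the producer never waits on parsing, and the consumer drains at its own pace.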
upd I now parse the marine traffic!
especially I love the safety messages - just look at the 2nd pic. reading them feels like I'm a participant in the business somewhere out there
have a safe watch you too, random person at the other end of the world 🥹
#ais #maritime #dotnet #buildinpublic
ais position message entity class with a json serializer context class configured for this entity. the nested structure may be weird looking, but I'm merely repeating after the messages I receive from API (next pic)
AIS API position message structure from documentation
yesterday I used source generated JSON deserialization
it completely removes the reflection stage which *really* *significantly* improves performance
very relevant for me, as I'm getting barraged with up to couple hundred AIS messages per second
#dotnet #buildinpublic #ais #geosky
AIS data just straight up written from every message received from the web socket
AAAA I CONNECTED TO REAL MARINE TRAFFIC! (╯°□°)╯︵ ┻━┻
(these are real AIS messages from a public source, connected with websocket)
aisstream.io
#maritime #navigation #geosky #buildinpublic #dotnet
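connecting looks roughly like this - a TS sketch where the subscription field names follow my reading of the public aisstream.io docs (APIKey, BoundingBoxes, FilterMessageTypes) and should be double-checked against them before use:

```typescript
// Build the subscription JSON that aisstream.io expects as the first
// websocket message. Field names and the bounding box are assumptions.
function buildSubscription(apiKey: string): string {
  return JSON.stringify({
    APIKey: apiKey,
    // one bounding box as [[lat, lon], [lat, lon]]; this one roughly covers Europe
    BoundingBoxes: [[[30, -20], [72, 40]]],
    FilterMessageTypes: ["PositionReport", "ShipStaticData"],
  });
}

// Usage (browser, or Node with a global WebSocket):
// const ws = new WebSocket("wss://stream.aisstream.io/v0/stream");
// ws.onopen = () => ws.send(buildSubscription("MY_KEY"));
// ws.onmessage = e => console.log(e.data);
```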