This repository was archived by the owner on Jan 23, 2024. It is now read-only.
I am trying out fiber from my local machine. I want to use fiber processes to do a side-effect job (insert data into databases), with Docker as the fiber backend. For testing, I have Elasticsearch and Postgres running in Docker containers on a Docker network called `test`.
I would like to pass the network name as a parameter (just like the Docker image) to the process that runs the Docker container.
I tried this out locally and it works for me. This is the modification I made to the docker_backend.py file:
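The original snippet is missing here, but the change was essentially to thread a `network` argument through to the Docker SDK's `containers.run()` call. A rough sketch (the helper name and structure are my approximation, not fiber's exact internals):

```python
# Sketch of the docker_backend.py change (names approximated, not fiber's
# exact code): build the kwargs for docker's client.containers.run() and
# include the network only when one was configured.

def build_run_kwargs(image, command, network=None):
    """Assemble keyword arguments for docker-py's containers.run()."""
    kwargs = {
        "image": image,
        "command": command,
        "detach": True,  # fiber launches worker containers in the background
    }
    if network is not None:
        # docker-py's containers.run() accepts a `network` kwarg that
        # attaches the container to a named Docker network.
        kwargs["network"] = network
    return kwargs
```

With something like this in place, the backend can call `client.containers.run(**build_run_kwargs(...))` and the worker container joins the `test` network alongside Elasticsearch and Postgres.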
I am not sure how best to expose the network as a parameter, though. Possibly via job_spec?
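One way this could be wired up (the field names, the `JobSpec` dataclass, and the `FIBER_DOCKER_NETWORK` variable are all hypothetical, not existing fiber settings) is to carry the network on the job spec with an environment-variable fallback:

```python
import os
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical job spec carrying the Docker network name; fiber's real
# job_spec fields may differ.
@dataclass
class JobSpec:
    image: str
    command: List[str] = field(default_factory=list)
    network: Optional[str] = None

def resolve_network(spec):
    """Prefer the network set on the job spec, then an env var, then None."""
    if spec.network is not None:
        return spec.network
    # FIBER_DOCKER_NETWORK is an assumed variable name for illustration.
    return os.environ.get("FIBER_DOCKER_NETWORK")
```

The backend would then pass `resolve_network(spec)` into the container-run call, so per-job settings win over a machine-wide default.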
Questions:
Is it recommended to use a fiber process for side-effect jobs, specifically to insert data into a database?
If I have 5 places I want to put the data (Elasticsearch, redis-stream, Postgres, other places), is it recommended to use 5 fiber processes to insert data into the respective "databases"?
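For context on question 2, fiber's `Process` mirrors the standard `multiprocessing.Process` API, so the one-process-per-sink layout can be sketched with the stdlib (swap in `fiber.Process` when running on the Docker backend; the sink names and writer function here are illustrative only):

```python
import multiprocessing as mp

def write_to_sink(sink_name, records, results):
    # Illustrative side-effect worker: real code would open a client for
    # the given sink (Elasticsearch, Postgres, ...) and insert `records`.
    # Here it just reports how many records it handled.
    results.put((sink_name, len(records)))

def fan_out(records, sinks):
    """Launch one writer process per sink and collect their reports."""
    results = mp.Queue()
    procs = [mp.Process(target=write_to_sink, args=(s, records, results))
             for s in sinks]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    return dict(results.get() for _ in sinks)
```

Because the writers share no state, one process per sink keeps a slow or failing database from blocking the others; the trade-off is 5 containers per batch on the Docker backend.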