Hello Grafana Community,
I am seeking advice on building an architecture to handle a substantial volume of traces and logs. Here is the proposed setup:
- 3 servers for GET (Grafana Enterprise Traces): these servers will handle both read and write operations and receive traces from the Alloy agent.
- 3 servers for GEL (Grafana Enterprise Logs): these servers will collect logs from the Alloy agent.

Both clusters will be used as data sources for visualization in Grafana.
The raw data volume is expected to be around 100-150 GB per day.
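To put that volume in perspective, here is my back-of-envelope conversion into a sustained ingest rate. This is only a rough sketch: it assumes the load is spread evenly over the day and split evenly across the 3 servers, with no allowance for peaks.

```python
# Back-of-envelope ingest rate from the stated daily volume.
# Assumptions: even load over 24h, even split across 3 servers.
daily_volume_gb = 150            # upper end of the expected 100-150 GB/day
servers = 3
seconds_per_day = 24 * 60 * 60

total_mb_per_s = daily_volume_gb * 1024 / seconds_per_day
per_server_mb_per_s = total_mb_per_s / servers

print(f"Total sustained ingest:  {total_mb_per_s:.2f} MB/s")       # ~1.78 MB/s
print(f"Per server (even split): {per_server_mb_per_s:.2f} MB/s")  # ~0.59 MB/s
```

So even at the high end this is under 2 MB/s sustained in aggregate, although real traffic will have peaks well above that average.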
My questions are as follows:
- GET servers: Can I configure all components to perform both read and write operations on each of the 3 GET servers? Additionally, can these same servers ingest the traces and serve them to Grafana for querying?
- Data distribution: Will the 3 GET servers be able to manage 100 GB per day distributed evenly, i.e. approximately 33 GB per server per day (the even split my rate math above assumes)?
- Component Allocation: Should I separate the read and write components onto different servers to better handle this volume of data, or is it feasible to have both operations on the same servers?
- Storage: How much blob storage is required to store traces and logs in two separate backends (one for traces, one for logs) with a 30-day retention period? My rough estimate is in the sketch after this list.
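For the storage question, here is the rough sizing math I have so far. The compression ratio is purely my assumption (blocks written to object storage should be considerably smaller than raw data, but I do not know the real ratio for Tempo and Loki blocks), so please correct it:

```python
# Rough blob-storage estimate for 30-day retention.
# The compression ratio below is an assumed placeholder, not a measured value.
daily_raw_gb = 150               # upper end of the expected 100-150 GB/day
retention_days = 30
assumed_compression = 5          # guess: stored blocks ~5x smaller than raw data

raw_total_gb = daily_raw_gb * retention_days
estimated_stored_gb = raw_total_gb / assumed_compression

print(f"Raw data over {retention_days} days: {raw_total_gb} GB "
      f"(~{raw_total_gb / 1024:.1f} TB)")
print(f"Estimated blob storage: {estimated_stored_gb:.0f} GB "
      f"(~{estimated_stored_gb / 1024:.2f} TB)")
```

If the 100-150 GB/day figure is split between traces and logs, the same math applies to each backend with its share of the daily volume.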
I appreciate any insights or suggestions you can provide regarding this architecture.
Thank you in advance for your assistance.