You should be good to go with even a modest rig on the Gravwell Community Edition at under 2GB/day of data; storage speed matters most. We recommend some high-performance flash storage up front to act as your hot tier, aging data out to slower spinning disks if you're looking for long-term storage. Beyond that, Gravwell will expand to use as much CPU power as you've got. Memory can be a factor, especially while you're learning the query language, but the requirements aren't crazy.
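As a rough sketch, tiering like this is set up per-well in the indexer's gravwell.conf. The well name, paths, tags, and durations below are illustrative examples, not a recommendation; check the Gravwell ageout documentation for the exact key names and options before copying this.

```ini
# Illustrative hot/cold tiered well in gravwell.conf.
# Paths, well name, and durations are hypothetical for this example;
# verify key names against the Gravwell ageout docs.
[Storage-Well "netflow"]
	Location=/mnt/nvme/gravwell/netflow           # hot tier on fast flash
	Tags=netflow
	Cold-Location=/mnt/spinners/gravwell/netflow  # cold tier on slow spinners
	Hot-Duration=7d                               # age to cold after a week
	Cold-Duration=365d                            # drop cold data after a year
```

The idea is that recent data (what you query most) lives on fast storage, and the indexer migrates it to the cheap spinners automatically as it ages.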
An example home setup runs on a Core i7-7700K with 16GB of RAM, a small NVMe drive, and btrfs spinners. It easily handles DNS, pcap, Netflow, Ubiquiti logs, syslog, Windows events, and random junk for the 15-20 systems on the network (while also doing dev work).
In the Gravwell drafthouse cluster, we spent $10k on some older hardware and another $10k on good storage, and that rig handles a full Shodan feed, VirusTotal data, all Reddit comments, pcap, Netflow, Tor relay traffic, and more. That's about 700GB a day without any trouble at all.