According to the Flume wiki, it is capable of handling approximately 70,000 events/sec on a single high-end machine with no data loss (at 300 bytes per event, as of the time of the test). If you can't imagine how big 70,000 events/sec is, see this "Tweets per second".
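To put that figure in perspective, here's a quick back-of-the-envelope calculation (a minimal sketch, using only the numbers quoted above) of the raw byte throughput it implies:

```python
# Back-of-the-envelope check of the quoted Flume benchmark figures.
events_per_sec = 70_000   # events/sec on a single high-end machine (quoted above)
bytes_per_event = 300     # bytes per event (quoted above)

throughput_bytes = events_per_sec * bytes_per_event
print(f"{throughput_bytes / 1e6:.1f} MB/s")        # ~21.0 MB/s
print(f"{throughput_bytes * 8 / 1e6:.1f} Mbit/s")  # ~168.0 Mbit/s
```

That works out to roughly 21 MB/s, or about 168 Mbit/s, comfortably within a single gigabit NIC.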
Today someone suddenly asked me about bioinformatics, Big Data, and MapReduce. I said, "You should be more concerned about data complexity than size. Studying GraphLab or Hama would also be helpful." ... :/
Memorise these, and you could pass yourself off as an expert.