Storm, a framework for processing streaming data in real time within the Hadoop ecosystem, got a major promotion. After joining the Apache Incubator in September 2013, it's now a full-fledged, top-level Apache Software Foundation project.
Storm's main application is processing streaming real-time data (or "fast data," as John Hugg describes it). Its processing is designed to scale across multiple nodes, with an advertised benchmark of up to 1 million 100-byte messages per second per node. As with most other work in the Hadoop ecosystem, Java is the most broadly supported language for Storm, though other languages are in the mix, as the sketch below suggests.
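To give a sense of what working with Storm in Java looks like, here is a minimal sketch of a topology: a spout that emits messages and a bolt that processes them, wired together and run in-process. It uses the pre-Apache `backtype.storm` package names current at the time; the spout and bolt themselves are illustrative placeholders rather than anything from Storm's own examples.

```java
import java.util.Map;

import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.spout.SpoutOutputCollector;
import backtype.storm.task.TopologyContext;
import backtype.storm.topology.BasicOutputCollector;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.topology.TopologyBuilder;
import backtype.storm.topology.base.BaseBasicBolt;
import backtype.storm.topology.base.BaseRichSpout;
import backtype.storm.tuple.Fields;
import backtype.storm.tuple.Tuple;
import backtype.storm.tuple.Values;
import backtype.storm.utils.Utils;

public class StreamSketch {

    // Spout: the source of the stream. Here it just emits a fixed sentence
    // repeatedly; in practice it would read from a queue or message broker.
    public static class MessageSpout extends BaseRichSpout {
        private SpoutOutputCollector collector;

        @Override
        public void open(Map conf, TopologyContext context, SpoutOutputCollector collector) {
            this.collector = collector;
        }

        @Override
        public void nextTuple() {
            Utils.sleep(100); // throttle the demo spout
            collector.emit(new Values("hello storm"));
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("sentence"));
        }
    }

    // Bolt: one processing step in the stream. This one splits sentences into words.
    public static class SplitBolt extends BaseBasicBolt {
        @Override
        public void execute(Tuple tuple, BasicOutputCollector collector) {
            for (String word : tuple.getStringByField("sentence").split(" ")) {
                collector.emit(new Values(word));
            }
        }

        @Override
        public void declareOutputFields(OutputFieldsDeclarer declarer) {
            declarer.declare(new Fields("word"));
        }
    }

    public static void main(String[] args) {
        // Wire the spout and bolt into a topology; the parallelism hint (4)
        // is what lets the bolt's work spread across multiple workers.
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("messages", new MessageSpout(), 1);
        builder.setBolt("split", new SplitBolt(), 4).shuffleGrouping("messages");

        // Run in-process for demonstration; a real deployment would submit
        // the topology to a cluster instead of using LocalCluster.
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("sketch", new Config(), builder.createTopology());
        Utils.sleep(5000);
        cluster.shutdown();
    }
}
```

The spout/bolt split is the heart of the model: spouts pull data in, bolts transform it, and Storm handles distributing both across the cluster.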