Recently, we were debugging some performance issues with a client's Drupal Commerce website. After doing the standard optimizations, we hooked up New Relic so we could see exactly what else could be trimmed.
We're big fans of Elasticsearch. In our experience it is significantly easier to deploy, manage, and scale than SOLR. Since we already use Elasticsearch for indexing and storing system logs, it makes sense for us to use it as the search backend for our clients' websites as well.
If you have a content manager who likes to use inline images in the Body field of a Drupal site, you might run into a situation where they want an image inserted at the beginning of the field but don't want it to display in the teaser. Of course, you could tell them to use the "Summary" part of the body field to manually set the teaser, but some content managers tend to forget about that, and it also forces them to enter the same content twice. No one wants to enter anything twice if they don't have to (even if it is merely copy and paste).
We've all done it. When we needed to find something in a log, we just did a cat or a tail and piped it to grep. Maybe we told grep to also show us X lines before and after what it found. That works fine and dandy when you've got just one or two servers. But what about 30, or 300, or 3000?
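The single-server workflow above looks something like this minimal sketch (the log file and search pattern here are made up for illustration):

```shell
# Create a small sample log so this example is self-contained.
printf 'line1\nline2\nERROR: boom\nline4\nline5\n' > /tmp/sample.log

# The classic approach: tail the log and pipe it to grep,
# asking grep for one line of context before (-B) and after (-A)
# each match.
tail -n 100 /tmp/sample.log | grep -B 1 -A 1 "ERROR"
```

This prints the matching line plus its surrounding context, which is plenty on one box but quickly becomes unmanageable when you have to SSH into dozens or hundreds of servers to repeat it.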
About a year and a half ago, I came across a post by Miguel Jacq about deploying Drupal automatically with Jenkins. I had grown tired of the manual backup->upload->test->crossFingers->pray->yellExpletives approach to developing or upgrading a Drupal site.