


September 21, 2018

This series of posts has thus far examined technical principles and context. After diving into the CME’s iLink architectural change, we examined our overall approach to engineering. In our previous post we went deeper and discussed principles for pursuing simple designs. The final post in our series takes a very different tack. Today we examine a mindset driven by an aspect of humanity rarely examined in engineering circles: the emotion of fear.

Fear lurked in the background of our response to CME’s iLink upgrade. We were afraid that this change put our business at risk; that our competitors (who we knew used FPGAs and held a latency advantage) were ahead of us; and that we had to take the change seriously. As the date drew closer, that fear bubbled more and more to the surface. Traders were making contingency plans should our success rates drop dramatically. Analysts were running numbers to see how we might survive if we could not trade on the CME. And in the technology group, we were racing to get our new software and hardware tested, polished, and ready for day one. That said, this change was just one of countless examples in which a fearful mindset has played a key role in our eventual success.

Jim Collins’ book Great By Choice examines the common threads in companies that thrive in the midst of chaotic, volatile industries. One of those common threads encapsulates well what we mean by this fearful mindset: “Productive Paranoia”:

10xers distinguish themselves not by paranoia per se, but by how they take effective action as a result. Paranoid behavior is enormously functional if fear is channeled into extensive preparation and calm, clearheaded action, hence our term “productive paranoia”. … Like Amundsen sensing great risk in betting on unproven methods and technologies, they avoid unnecessary risks that expose them to calamity. Like Amundsen, they succeed in an uncertain and unforgiving environment through deliberate, methodical, and systematic preparation, always asking “What if? What if? What if?”

The trading industry and the financial markets are a prime example of an “uncertain and unforgiving environment.” Exchanges change their market structure. A stray Twitter post unexpectedly lights up a random section of the market. A competitor who has been nipping at our heels finally leapfrogs us. A bug in a trading algorithm causes things to go awry. The industry’s history is full of famous and expensive examples.

The markets are a scary place, a pure distillation of evolutionary dynamics which will quickly and ruthlessly referee against your mistakes. In designing, testing, and deploying our new systems we were keenly aware of this. We were continually motivated by two fears. The first was that we would miss this deadline, our competitors would leapfrog us, and our trading would suffer greatly. We viewed this transition as the key risk to our business at the time. The second was a fear that our new system might exhibit erroneous behavior and so incur substantial financial loss. Given that the new system reduced our latency by an order of magnitude, relied on technology which was new to our office, and was more opaque than software, this fear was particularly acute. We were keenly aware that we were entering a new realm of speed and complexity.

We chose senior, wise, humble, and ultimately fearful engineers to work on the project. We built in multiple layers of limits to the system including simple mechanical checks in hardware (e.g. a hard coded limit of no more than 10 messages per second by an FPGA). We planned for what we would do if we missed the deadline, or if we made the deadline, but our new system was not good enough. Like Collins says above, we continually asked “What if? What if? What if?” And in answering those questions and planning for those scenarios we diligently adhered to our principles: we examined our constraints, stayed disciplined in our process, and socialized our work. 
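The hard-coded message-rate limit mentioned above lived in FPGA hardware, but the idea translates directly to software. The sketch below is a hypothetical Python analogue of that kind of simple mechanical check (the class and parameter names are illustrative, not Optiver’s actual implementation): count messages in a one-second window and refuse to send once the cap is hit.

```python
import time


class MessageRateLimiter:
    """Illustrative software analogue of a hard-coded hardware rate limit.

    A hypothetical sketch of the kind of mechanical check described in the
    post (e.g. "no more than 10 messages per second"), not a real system.
    """

    def __init__(self, max_messages_per_second: int = 10):
        self.max_per_second = max_messages_per_second
        self.window_start = time.monotonic()
        self.count = 0

    def allow(self) -> bool:
        """Return True if a message may be sent in the current window."""
        now = time.monotonic()
        if now - self.window_start >= 1.0:
            # A new one-second window has begun: reset the counter.
            self.window_start = now
            self.count = 0
        if self.count >= self.max_per_second:
            # The mechanical limit has tripped; refuse to send.
            return False
        self.count += 1
        return True
```

The value of a check this crude is precisely that it is too simple to be wrong: whatever erroneous behavior the rest of the system exhibits, it cannot flood the exchange faster than the limiter allows.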

Optiver’s motto is “We Improve the Market”. Our pursuit of that mission requires building software which upholds the integrity of the market rather than undermining its stability. When everything is going crazy in the markets, we want our trading systems humming along like always. And the only way to get there is by being very afraid of what could go wrong, and building a simple, solid, stable system to handle those wild moments. In the end, we want our engineers to be productively paranoid and to carry a fearful mindset as they build their software. Because that fear will drive them to add the extra test. Fear will nudge them to involve the extra person in their code review. Fear will force them to take seriously the offhand remark that reminded them of a line of code they changed thoughtlessly in a rush to meet a deadline.

The point is, ladies and gentlemen, that fear — for lack of a better word — is good.

David Kent, Chief of Staff – Technology

David is a Stanford Computer Science alum and spent several years as a software developer before joining Optiver as a Software Engineering Lead in 2009. He has led many of Optiver’s software development teams and is presently Chief of Staff for the Optiver US Technology Group.