
## Order book temperature

Temperature is one of the principal quantities in thermodynamics. It is a macroscopic intensive variable: it does not depend on the bulk amount of elementary entities contained in the system. Let's try to carry the physical definition over to the trading world. Thermodynamics defines temperature as:

$$\frac{1}{T} = \frac{\partial S}{\partial U},$$

where $S$ is the entropy and $U$ is the internal energy of the system.

In statistical mechanics, entropy is a measure of the number of ways in which a system may be arranged, often taken as a measure of "disorder" (the higher the entropy, the higher the disorder). This definition makes the entropy proportional to the natural logarithm of the number of possible microscopic configurations (microstates) that could give rise to the observed macroscopic state (macrostate) of the system. For the sake of simplicity we take the constant of proportionality equal to one:

$$S = \ln \Omega.$$

An order book is in fact a set of all buy/sell orders. Let's denote it as

$$O = \{(b_i, B_i)\} \cup \{(s_j, S_j)\},$$

where $b_i$ ($s_j$) is the price and $B_i$ ($S_j$) is the amount of contracts at that price for buy (sell) orders. Let's normalise the amounts by the total buy ($T_b$) and sell ($T_s$) contracts:

$$\beta_i = \frac{B_i}{T_b}, \qquad \sigma_j = \frac{S_j}{T_s}, \qquad T_b = \sum_i B_i, \quad T_s = \sum_j S_j.$$

Treating the normalised amounts as probabilities, the entropy becomes the sum of the entropies of the buy and sell sides:

$$S = S_b + S_s = -\sum_i \beta_i \ln \beta_i \;-\; \sum_j \sigma_j \ln \sigma_j.$$
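As a quick illustration, here is a short Python sketch (the order-book numbers are invented for the demo) that computes the entropy of each side from the normalised contract amounts:

```python
import math

def side_entropy(amounts):
    """Gibbs entropy of one side of the book, with the contract
    amounts normalised to a probability distribution."""
    total = sum(amounts)
    probs = [a / total for a in amounts]
    return -sum(p * math.log(p) for p in probs)

# Hypothetical order book: amounts of contracts at each price level.
buy_amounts = [120, 80, 50, 30]   # B_i at bid prices b_i
sell_amounts = [100, 90, 60, 20]  # S_j at ask prices s_j

S_b = side_entropy(buy_amounts)
S_s = side_entropy(sell_amounts)
print(S_b, S_s, S_b + S_s)
```

A uniform book (equal amounts at every level) maximises the entropy of a side at $\ln n$ for $n$ levels; concentrating volume at one level drives it toward zero.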

The internal energy is the total energy contained in a thermodynamic system. It is the energy needed to create the system, excluding the energy required to displace the system's surroundings, any energy associated with motion of the system as a whole, or energy due to external force fields. Thus, to create the order book, one needs all the money of the buy side and the securities of the sell side. There may be doubts about how to price the securities of the sell side, but we'll take the easiest approach and value each order at its own quoted price:

$$U = \sum_i b_i B_i + \sum_j s_j S_j.$$

Let's try to derive a formula for the temperature under the given assumptions. First, the total differentials of the entropy and the internal energy should be obtained:

$$dS = \sum_i \frac{\partial S}{\partial B_i}\,dB_i + \sum_j \frac{\partial S}{\partial S_j}\,dS_j, \qquad dU = \sum_i b_i\,dB_i + \sum_j s_j\,dS_j.$$

Then we can find the derivative of entropy with respect to energy from the definition of the total derivative:

$$\frac{1}{T} = \frac{dS}{dU},$$

where

$$\frac{\partial S}{\partial B_i} = -\frac{\ln \beta_i + S_b}{T_b}, \qquad \frac{\partial S}{\partial S_j} = -\frac{\ln \sigma_j + S_s}{T_s}.$$

Substituting into the total differentials yields the formula for the temperature:

$$T = \frac{dU}{dS} = -\,\frac{\sum_i b_i\,dB_i + \sum_j s_j\,dS_j}{\sum_i \frac{\ln \beta_i + S_b}{T_b}\,dB_i + \sum_j \frac{\ln \sigma_j + S_s}{T_s}\,dS_j}.$$
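The relation $1/T = dS/dU$ can be checked numerically. The sketch below (order-book numbers are made up) perturbs the amount at the best bid by a small $dB$ and forms the finite-difference ratio $dS/dU$:

```python
import math

def entropy(amounts):
    """Gibbs entropy of one side, amounts normalised to probabilities."""
    total = sum(amounts)
    return -sum(a / total * math.log(a / total) for a in amounts)

def internal_energy(bids, asks):
    """Money of the buy side plus sell-side securities at quoted prices."""
    return sum(p * a for p, a in bids) + sum(p * a for p, a in asks)

def total_entropy(bids, asks):
    return entropy([a for _, a in bids]) + entropy([a for _, a in asks])

# Hypothetical book: (price, amount) pairs.
bids = [(99.0, 120.0), (98.0, 80.0), (97.0, 50.0)]
asks = [(101.0, 100.0), (102.0, 90.0), (103.0, 60.0)]

# Perturb only the best-bid amount by dB and estimate 1/T = dS/dU.
dB = 1e-5
bids2 = [(bids[0][0], bids[0][1] + dB)] + bids[1:]
dS = total_entropy(bids2, asks) - total_entropy(bids, asks)
dU = internal_energy(bids2, asks) - internal_energy(bids, asks)
inv_T = dS / dU
print("1/T =", inv_T)
```

Because adding contracts to an already-heavy level lowers the normalised entropy while raising the energy, this particular perturbation gives a negative $1/T$; the sign of the temperature depends on which levels the book change hits.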

## Markov chain for Geometric Brownian Motion parameters

A Markov chain is a discrete-time random process with the Markov property. Its components are states and the transition probabilities between them. The Markov property states that the probability of the next state depends only on the current state.

So a Markov chain is a set of states together with all transition probabilities between those states.

## Simple chain for drift

Let's assume that estimation of the drift parameter leads to one of the following two states:

- Positive, i.e. the drift is greater than or equal to zero
- Negative, i.e. the drift is less than zero

So, one could construct a Markov chain over these two states, described by the four transition probabilities $P(+|+)$, $P(-|+)$, $P(+|-)$, and $P(-|-)$.
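A minimal sketch of fitting such a chain: given a history of estimated drift signs (the sequence below is invented), count observed transitions and normalise each row into probabilities:

```python
def transition_matrix(states):
    """Estimate transition probabilities of a two-state chain
    ('+' for non-negative drift, '-' for negative drift)
    by counting observed consecutive transitions."""
    labels = ['+', '-']
    counts = {a: {b: 0 for b in labels} for a in labels}
    for prev, cur in zip(states, states[1:]):
        counts[prev][cur] += 1
    probs = {}
    for a in labels:
        total = sum(counts[a].values())
        probs[a] = {b: (counts[a][b] / total if total else 0.0)
                    for b in labels}
    return probs

# Hypothetical sequence of estimated drift signs over time:
history = ['+', '+', '-', '+', '-', '-', '+', '+']
P = transition_matrix(history)
print(P)
```

Each row of the resulting matrix sums to one, so `P['+']` is the conditional distribution of the next drift sign given that the current drift is non-negative.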

## Maximum Likelihood Estimation of Stochastic Process Parameters

Maximum Likelihood Estimation (MLE) is a method of estimating the parameters of a statistical model. The basic idea is to write down the joint probability density of the observations and to maximize it over the model's parameters. Put differently, we are looking for the most probable explanation of the observed data.

### Problem setup

Assume that we're given a one-dimensional stochastic process:

$$dS_t = \mu(S_t, t; \theta)\,dt + \sigma(S_t, t; \theta)\,dW_t,$$

where $\mu$ and $\sigma$ are some functions of their arguments, $\theta$ is the parameter vector, and $W_t$ is a standard Wiener process.

We observe this process by measuring $S_i = S(t_i)$, where $i = 0, \dots, N$. For the sake of simplicity, assume that the observations are equidistant in time, i.e. $t_{i+1} - t_i = \Delta t$.

So, let's estimate the parameters.
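For geometric Brownian motion the MLE has a closed form, because equidistant log-returns are i.i.d. normal with mean $(\mu - \sigma^2/2)\Delta t$ and variance $\sigma^2 \Delta t$. A sketch (the simulated path and its parameters are made up for the demo) that recovers $\mu$ and $\sigma$ from observations:

```python
import math
import random

def gbm_mle(prices, dt):
    """Closed-form MLE of GBM drift mu and volatility sigma from
    equidistant price observations: fit the normal distribution of
    log-returns, then invert the mean/variance relations."""
    x = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    n = len(x)
    m = sum(x) / n
    v = sum((xi - m) ** 2 for xi in x) / n  # MLE variance uses 1/n
    sigma2 = v / dt
    mu = m / dt + sigma2 / 2.0
    return mu, math.sqrt(sigma2)

# Simulate a GBM path with known parameters, then recover them.
random.seed(42)
mu_true, sigma_true, dt = 0.1, 0.2, 1.0 / 252
S = [100.0]
for _ in range(100_000):
    z = random.gauss(0.0, 1.0)
    S.append(S[-1] * math.exp((mu_true - 0.5 * sigma_true**2) * dt
                              + sigma_true * math.sqrt(dt) * z))

mu_hat, sigma_hat = gbm_mle(S, dt)
print(mu_hat, sigma_hat)
```

Note the usual asymmetry: $\sigma$ is recovered very accurately from a long path, while the drift estimate stays noisy because its standard error shrinks only with the total observation time $N\Delta t$, not with the number of samples.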

## Common trading task – Part 1

## Stochastic process

As Wikipedia suggests, a stochastic process is a random process, the counterpart to a deterministic process. For our simple tasks, all we need is a time series: for each moment of time $t$ we have a single random value, the price $S(t)$. The process has a definite starting point, but its further evolution carries some degree of uncertainty, described by a probability distribution.

A lot of types of stochastic processes have been studied in the mathematical literature. In this article I use only Itō processes, as they provide a quite good approximation of price dynamics.
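A generic Itō process $dS = a(S,t)\,dt + b(S,t)\,dW$ can be simulated step by step with the Euler–Maruyama scheme; the sketch below does this for made-up coefficients corresponding to geometric Brownian motion:

```python
import math
import random

def euler_maruyama(a, b, s0, t_max, n_steps, rng):
    """Simulate one path of the Ito process dS = a(S,t) dt + b(S,t) dW
    using the Euler-Maruyama discretisation."""
    dt = t_max / n_steps
    t, s = 0.0, s0
    path = [s]
    for _ in range(n_steps):
        dw = rng.gauss(0.0, math.sqrt(dt))  # Wiener increment ~ N(0, dt)
        s += a(s, t) * dt + b(s, t) * dw
        t += dt
        path.append(s)
    return path

rng = random.Random(1)
# Example coefficients: GBM with a = mu*S and b = sigma*S.
mu, sigma = 0.05, 0.2
path = euler_maruyama(lambda s, t: mu * s,
                      lambda s, t: sigma * s,
                      s0=100.0, t_max=1.0, n_steps=252, rng=rng)
print(path[-1])
```

The same function handles any Itō process whose drift and diffusion can be written as plain Python callables, which is all the later parameter-estimation examples need.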