
Algorithms get pampered by Microcontrollers, Gate Arrays and EPROMs July 6, 2009

Posted by jbarseneau in Uncategorized.
2 comments

A while back, a handful of companies addressed their computational scalability by looking at the processing units they were using, and they looked very closely. These hardware-acceleration players are still speaking to Wall Street customers. In messaging you have Tervela; a secondary player, Xambala, has put a parser on a chip; and thirdly ACTIV Financial has demonstrated its solution, a ticker plant coded directly into a silicon chip.

The electrical and electronic engineers who design and develop central processing units (CPUs) know that if you have a single, repeatable function that must be computed, it can run much faster if the processor goes on a "physics" diet: slim it down, throw out the junk, clean out the attic and downsize the house. That means reduced instruction sets, optimized on-chip memory, and so on. All three of these companies have done exactly that. Whether for routing purposes or a hardwired ticker plant, they are going "hard"-wired with EPROMs, gate arrays, and custom ICs.
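As a rough software analogy (not a description of how any of these vendors actually implement their products), "hardwiring" a single repeatable function is a bit like replacing a general-purpose computation with a precomputed lookup table, which is essentially what burning a truth table into an EPROM does in silicon. The sketch below is hypothetical; the function name and the transformation itself are made up for illustration.

```python
# Hypothetical sketch: "hardwiring" one repeatable function as a lookup table,
# roughly what burning a truth table into an EPROM accomplishes in hardware.

def price_tick_to_bucket(tick: int) -> int:
    """General-purpose version: computed on every call."""
    return (tick * 37 + 11) % 256  # stand-in for some fixed transformation

# "Burn" the function into a table once, up front (the EPROM analogy).
EPROM_TABLE = [price_tick_to_bucket(t) for t in range(256)]

def price_tick_to_bucket_hardwired(tick: int) -> int:
    """Fixed-function version: a single indexed read, no computation."""
    return EPROM_TABLE[tick & 0xFF]

if __name__ == "__main__":
    # Both versions agree over the whole 8-bit input range.
    assert all(price_tick_to_bucket(t) == price_tick_to_bucket_hardwired(t)
               for t in range(256))
```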

More to Come…

Quantum Again… July 6, 2009

Posted by jbarseneau in Uncategorized.
add a comment

So, back to a fun topic that seems to have a great following among naysayers and enthusiasts alike. There are a number of quantum computing candidates out there. I have done some preliminary work with the guys at D-Wave, and I feel these machines will be a very useful tool in financial engineering in the near future. To remind ourselves of what is considered a quantum computer, let's look at the requirements David DiVincenzo, of IBM, listed for a practical quantum computer:

  • scalable physically to increase the # of qubits
  • qubits can be initialized to arbitrary values
  • quantum gates faster than decoherence time
  • Turing-complete gate set
  • qubits can be read easily

There are a number of practical difficulties in building a quantum computer, and thus far quantum computers have only solved trivial problems. One major problem is keeping the components of the computer in a coherent state, as the slightest interaction with the external world causes the system to decohere.

The rest of this post surveys what is currently known mathematically about the power of quantum computers: the known results from computational complexity theory and the theory of computation as they apply to quantum computers.

The class of problems that can be efficiently solved by quantum computers is called BQP, for "bounded error, quantum, polynomial time". Quantum computers only run probabilistic algorithms, so BQP on quantum computers is the counterpart of BPP on classical computers. It is defined as the set of problems solvable by a polynomial-time algorithm whose probability of error is bounded away from one half. A quantum computer is said to "solve" a problem if, for every instance, its answer is right with high probability. If that solution runs in polynomial time, then the problem is in BQP.
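To make "bounded away from one half" concrete, here is a small illustrative sketch (nothing specific to quantum hardware, and the 2/3 success probability is just an assumed example): if a bounded-error algorithm answers correctly with probability 2/3 on every independent run, repeating it and taking a majority vote drives the error down rapidly, which is why the exact error bound in the definition does not matter.

```python
# Illustrative sketch: amplifying a bounded-error algorithm by majority voting.
# Assumes each independent run is correct with probability p > 1/2.
from math import comb

def majority_correct(p: float, runs: int) -> float:
    """Probability that more than half of `runs` independent runs are correct."""
    return sum(comb(runs, k) * p**k * (1 - p)**(runs - k)
               for k in range(runs // 2 + 1, runs + 1))

if __name__ == "__main__":
    for n in (1, 11, 51, 101):
        print(n, round(majority_correct(2 / 3, n), 6))
    # The per-run error of 1/3 shrinks rapidly as the number of runs grows.
```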

BQP is suspected to be disjoint from the class of NP-complete problems and to be a strict superset of P, but neither is known. Both integer factorization and the discrete logarithm are in BQP. Both are NP problems suspected to lie outside P, and both are suspected not to be NP-complete. There is a common misconception that quantum computers can solve NP-complete problems in polynomial time. That is not known to be true, and it is generally suspected to be false.

An operator for a quantum computer can be thought of as changing a vector by multiplying it with a particular matrix. Multiplication by a matrix is a linear operation. It has been shown that if a quantum computer could be designed with nonlinear operators, then it could solve NP-complete problems in polynomial time. It could even do so for #P-complete problems. It is not yet known whether such a machine is possible.
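As a toy illustration of "operator as matrix multiplication" (a minimal single-qubit sketch using numpy, not a model of any real device): a qubit state is a length-2 complex vector, and a gate such as the Hadamard is a unitary matrix applied to it.

```python
# Minimal sketch: a quantum gate acting on a qubit is a matrix-vector product.
import numpy as np

ket0 = np.array([1.0, 0.0], dtype=complex)                    # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard gate

state = H @ ket0                    # equal superposition of |0> and |1>
probabilities = np.abs(state) ** 2  # Born rule: |amplitude|^2

print(state)          # [0.707..+0.j  0.707..+0.j]
print(probabilities)  # [0.5 0.5]

# The operation is linear: H @ (a*x + b*y) == a*(H @ x) + b*(H @ y).
```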

Although quantum computers are sometimes faster than classical computers, they can’t solve any problems that classical computers can’t solve, given enough time and memory. A Turing machine can simulate a quantum computer, so a quantum computer could never solve an undecidable problem like the halting problem. The existence of quantum computers does not disprove the Church-Turing thesis.

Advanced Visualization Techniques in Finance July 21, 2006

Posted by jbarseneau in Trading, Visualization.
add a comment

Precipitated by the recent award given to the company Fractal Edge for its innovative work in financial visualization, and by my timely introduction to a gifted "techno-artist" named W. Bradford Paley, I felt compelled to address the ever-challenging task of delivering meaningful advances in the field of financial visualization. I must admit that before I met Bradford I was only aware that I understood pictures better than a long string of lexically and syntactically correct text describing an abstract concept. But his passion and intuitive style of presenting visualization techniques revitalized my spirit, and my belief that the "Holo-Deck" was not "crap": there is indeed much more available when you put your mind, or cerebral cortex, to it, which he does to our gratitude!

Many cognitive studies have demonstrated convincingly that human cognition relies more heavily on visual than numeric stimuli. People absorb and process large amounts of visual data using the visual cortex, a specialized area of the brain, every second. Despite this fact, the most common basis for presenting and analyzing quantitative research in financial engineering is still numerical, e.g., spreadsheets and tables. Some have argued that visualization is less achievable for financial applications because of the high dimensionality of the problems. These arguments have been refuted dramatically in other highly quantitative disciplines such as fluid dynamics, electrical engineering, mechanical engineering, molecular biology, and meteorology, in which visualization has come to play a central role in both basic research and industry applications. 

(more…)

The Application of Quantum Computing in Finance 1.0 July 6, 2006

Posted by jbarseneau in Uncategorized.
7 comments

I know… I'm crazy for talking about the application of quantum computation when we still only have 8-qubit computers, still severely challenged by the reverse-salient problem of software applications preceding hardware development, error correction, decoherence, and practical construction issues. But what the hey, the internet crept up on Microsoft quickly, and I think anyone serious about advanced computing for finance needs to keep a finger on the pulse of this groundbreaking development and stay on top of it. More so, this is important to us because we know banks will be the first to invest when these machines start spinning their quarks in a robust manner.

First, a very quick background: In the early 1980s, renowned physicist Richard Feynman began to investigate the possibility of having a quantum computing machine that could simulate quantum systems as conventional computers simulate classical physical processes. He considered the "representation of binary numbers in relation to the quantum states of two-state quantum systems". While classical computers of the mid 20th century utilized as their basic unit of information the "bit", which could represent at any one time either "0" or "1", a quantum bit, or "qubit", would harness the power of quantum mechanics in order to represent "0", "1", or a superposition of both. Thus a quantum computer would be able to simulate quantum mechanical processes that classical computers take too long to do or are entirely unable to handle.
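A rough sketch of why classical simulation struggles (a simplified illustration I am adding here, not something from the original argument): the state of n qubits is a vector of 2^n complex amplitudes, so the memory needed just to store it classically doubles with every qubit added.

```python
# Illustrative sketch: classical memory needed to store an n-qubit state vector.
# Each of the 2**n amplitudes is one complex number (16 bytes in double precision).

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

if __name__ == "__main__":
    for n in (8, 30, 50):
        gib = state_vector_bytes(n) / 2**30
        print(f"{n} qubits -> {gib:,.6f} GiB")
    # 8 qubits fit trivially; around 50 qubits the state vector already
    # exceeds the memory of any classical machine.
```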

(more…)

Fund of Funds and Network Science June 30, 2006

Posted by jbarseneau in Uncategorized.
add a comment

A fund of funds is simply a fund that invests in other funds. Just as a mutual fund invests in a number of different securities, a fund of funds holds shares of many different funds. These funds were designed to achieve even greater diversification than traditional funds, and also to put large amounts of capital to work. On the downside, expense fees on funds of funds are typically higher than those on regular funds because they include part of the expense fees charged by the underlying funds. In addition, since a fund of funds buys many different funds which themselves invest in many different stocks, it is possible for the fund of funds to own the same stock through several different funds, and it can be difficult to keep track of the overall holdings.
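As an illustration of that look-through problem (a hypothetical sketch; the fund names and weights below are made up), the overall exposure to each stock comes from multiplying each underlying fund's weight in the portfolio by that fund's weight in the stock and summing the overlaps:

```python
# Hypothetical sketch: look-through aggregation of a fund of funds' holdings.
from collections import defaultdict

# Weight of each underlying fund in the fund of funds (made-up numbers).
fof_weights = {"Fund A": 0.5, "Fund B": 0.3, "Fund C": 0.2}

# Each underlying fund's own stock weights (made-up holdings).
fund_holdings = {
    "Fund A": {"MSFT": 0.4, "IBM": 0.6},
    "Fund B": {"MSFT": 0.7, "XOM": 0.3},
    "Fund C": {"IBM": 0.5, "XOM": 0.5},
}

# Effective exposure to each stock across all underlying funds.
exposure = defaultdict(float)
for fund, weight in fof_weights.items():
    for stock, stock_weight in fund_holdings[fund].items():
        exposure[stock] += weight * stock_weight

print(dict(exposure))  # MSFT shows up through both Fund A and Fund B
```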

Many mutual fund holders also suffer from being over-diversified. Some funds, especially the larger ones, have so many assets (i.e., cash to invest) that they have to hold literally hundreds of stocks, and consequently, so do you. In some cases this makes it nearly impossible for the fund to outperform indexes, which is the whole reason you invested in the fund and are paying the fund manager a management fee. In the sage words of the "Oracle of Omaha", Warren Buffett: "wide diversification is only required when investors do not understand what they are doing".

When analyzing investment performance, statistical measures are often used to compare funds. These measures are usually reduced to a single figure representing one aspect of past performance:

  • Alpha represents the fund's return when the benchmark's return is 0. It shows the fund's performance relative to the benchmark and can demonstrate the value added by the fund manager; the higher the alpha, the better the manager. Alpha investment strategies tend to favor stock-selection methods to achieve growth.
  • Beta is an estimate of how much the fund will move if its benchmark moves by 1 unit. It shows the fund's sensitivity to changes in the market. Beta investment strategies tend to favor asset-allocation models to achieve outperformance.
  • R-squared measures the association between a fund and its benchmark, on a scale from 0 to 1; 1 indicates perfect correlation and 0 indicates no correlation. It is useful in determining whether the fund manager is adding value through their investment choices or acting as a closet tracker, mirroring the market and making little difference.
  • Standard deviation measures the volatility of the fund's performance over a period of time; the higher the figure, the greater the variability of the fund's performance. High volatility is an indicator of increased investment risk in a fund.
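Here is a minimal sketch of how these four figures can be estimated from return series (illustrative only; the monthly return numbers are made up, and practitioners would typically use excess returns over a risk-free rate and longer histories):

```python
# Minimal sketch: estimating alpha, beta, R-squared and standard deviation
# from a fund's returns and its benchmark's returns (made-up monthly data).
import numpy as np

fund = np.array([0.02, -0.01, 0.03, 0.01, -0.02, 0.04])
benchmark = np.array([0.015, -0.005, 0.025, 0.0, -0.015, 0.03])

# Ordinary least squares: fund = alpha + beta * benchmark + error
beta, alpha = np.polyfit(benchmark, fund, 1)

# R-squared: squared correlation between fund and benchmark returns
r_squared = np.corrcoef(fund, benchmark)[0, 1] ** 2

# Standard deviation of the fund's returns (sample, not annualized)
std_dev = fund.std(ddof=1)

print(f"alpha={alpha:.4f} beta={beta:.4f} R^2={r_squared:.4f} stdev={std_dev:.4f}")
```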

A large fund of funds has so much capital to put to work that efficacy becomes the defining bottleneck. In other words, at some point each incremental placement of capital no longer improves the overall return, and in many cases it diminishes returns. Funds of funds run large, complicated, recursive algorithms over each placement, and over each position within each placement, to assess the alpha or beta of the scenario.

Duncan Watts is the premier expert on network science and is a professor at Columbia University. He has written two seminal books, "Six Degrees" and "Small Worlds".

It may be interesting to model the fund-of-funds world as a network and determine whether it is a well-structured network or not. This may lead to less computationally intense ways of finding capital-placement opportunities and improving the efficacy of large funds.
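A starting point might look like the following sketch (hypothetical: the fund names and edges are invented, and the network library and metrics are just one possible toolkit). Build a bipartite graph of funds of funds and the underlying funds they allocate to, then look at simple network statistics such as degree and shared-holding overlap.

```python
# Hypothetical sketch: modeling the fund-of-funds world as a bipartite network.
# Fund names and edges are invented for illustration.
import networkx as nx
from networkx.algorithms import bipartite

G = nx.Graph()
funds_of_funds = ["FoF 1", "FoF 2", "FoF 3"]
underlying = ["Fund A", "Fund B", "Fund C", "Fund D"]

G.add_nodes_from(funds_of_funds, bipartite=0)
G.add_nodes_from(underlying, bipartite=1)
G.add_edges_from([
    ("FoF 1", "Fund A"), ("FoF 1", "Fund B"),
    ("FoF 2", "Fund B"), ("FoF 2", "Fund C"),
    ("FoF 3", "Fund A"), ("FoF 3", "Fund C"), ("FoF 3", "Fund D"),
])

# How concentrated are allocations? Degree of each underlying fund.
print(dict(G.degree(underlying)))

# Project onto the fund-of-funds side: two FoFs are linked if they share a holding.
projection = bipartite.weighted_projected_graph(G, funds_of_funds)
print(projection.edges(data=True))
```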
