Tag Archives: Quant analysis

Quant ANALYSIS insights

1.1 Video: Principal Component Analysis of the Swap Curve

This video shows the geometrical interpretation of the classical parallel shift/steepening/twist PCA analysis of the term structure of interest rates.
The first factor (parallel shift) is a movement along the first principal axis of the ellipsoid defined by the mean and covariance, and similarly for the second and third factors.
The high correlation among interest rates is reflected in the long, flat ellipsoid. Indeed, the first eigenvalue (~length of longest axis of ellipsoid) is much larger than the second (~length of second axis of ellipsoid), which is much larger than the third (~length of third principal axis of ellipsoid).
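To make the geometry concrete, here is a minimal R sketch (my own illustration, not the video's code): simulate correlated rate changes from an assumed covariance and check that the first eigenvalue dwarfs the rest.

```r
# PCA on simulated swap-rate changes (all parameters are invented).
set.seed(42)
n_obs      <- 1000
maturities <- c(1, 2, 5, 10, 20, 30)

# High cross-maturity correlation, decaying slowly with maturity distance.
corr  <- outer(maturities, maturities, function(a, b) exp(-0.02 * abs(a - b)))
vols  <- rep(0.01, length(maturities))
Sigma <- diag(vols) %*% corr %*% diag(vols)

# Simulate rate changes with covariance Sigma and run PCA.
R <- chol(Sigma)
d_rates <- matrix(rnorm(n_obs * length(maturities)), n_obs) %*% R
pca <- prcomp(d_rates)

# Eigenvalues ~ (squared) axis lengths of the ellipsoid: expect fast decay.
eigenvalues <- pca$sdev^2
print(round(eigenvalues / sum(eigenvalues), 3))   # variance explained per factor

# The first three loadings look like parallel shift, steepening, twist.
matplot(maturities, pca$rotation[, 1:3], type = "l", lty = 1,
        xlab = "Maturity (yrs)", ylab = "Loading")
```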

[ Join discussion ]

Principal Component Analysis of the Swap Curve
1.2 Video: Distribution of the grade (intro to copulas)

If we feed an arbitrary random variable X through its own cumulative distribution function (cdf), we obtain a random variable U, called the grade of X.
U has a uniform distribution on the interval [0,1] regardless of the original distribution of X.
This video provides the intuition behind this simple but powerful result, via an empirical application to X~Normal and X~Log-Normal with varying parameters.
The grade U has a uniform distribution because:
– the cdf of X is steep on intervals of high probability (blue, in the video) and thus dilutes the dense samples of X;
– the cdf of X is flat on intervals of low probability (pink, in the video) and thus concentrates the scarce samples of X.
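A quick R experiment (mine, not the video's) makes the result tangible: feed samples through their own cdf and the histogram of the grades comes out flat, whatever the input distribution.

```r
# Grade of a log-normal: U = F(X) should be Uniform[0,1].
set.seed(1)
x <- rlnorm(1e5, meanlog = 0, sdlog = 0.5)   # X ~ Log-Normal
u <- plnorm(x, meanlog = 0, sdlog = 0.5)     # feed X through its own cdf

hist(u, breaks = 50, freq = FALSE,
     main = "Grade of a log-normal", xlab = "U = F(X)")  # flat histogram

# Same with a normal; a KS test should not reject uniformity.
x2 <- rnorm(1e5, mean = 3, sd = 2)
u2 <- pnorm(x2, mean = 3, sd = 2)
ks.test(u2, "punif")
```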

[ Join discussion ]

Distribution of the grade
1.3 Slides + videos: Linear Factor Models (Part 1)

Linear Factor Models are as ubiquitous as they are misunderstood in quantitative finance: systematic strategies, smart beta, Fama-French, APT, CAPM, regressions, idiosyncratic risk, r-squared, covariance inversion, factor alignment, simulations, etc. How do they all come together?
In this issue we start addressing the theory and classification of Linear Factor Models.
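As a taste of the theory, a hedged R sketch of the basic decomposition r_t = B f_t + eps_t; all sizes and parameters are invented for illustration.

```r
# Toy linear factor model: simulate, then recover loadings by regression.
set.seed(7)
n_obs <- 500; n_assets <- 10; n_factors <- 3

f   <- matrix(rnorm(n_obs * n_factors, sd = 0.01), n_obs)    # factor returns
B   <- matrix(runif(n_assets * n_factors, -1, 1), n_assets)  # true loadings
eps <- matrix(rnorm(n_obs * n_assets, sd = 0.005), n_obs)    # idiosyncratic
r   <- f %*% t(B) + eps                                      # asset returns

fit   <- lm(r ~ f)              # one multivariate time-series regression
B_hat <- t(coef(fit)[-1, ])     # drop intercepts; assets x factors

# R-squared per asset: how much risk the factors explain.
r2 <- sapply(summary(fit), function(s) s$r.squared)
round(r2, 2)
```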

[ See presentation ]   [ Join discussion ]

1.4 “Did you know?”

  • The risk of a random walk does not always grow as the square root of time (see the sketch after this list)  [ Comment ]
  • Robust estimates of the covariance matrix from series of different lengths are not computed with the EM algorithm  [ Comment ]
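On the first point, a toy R illustration (my own, not the newsletter's): give the walk AR(1)-persistent increments and its risk grows faster than sqrt(t).

```r
# Risk vs horizon for iid and for AR(1)-correlated increments.
set.seed(10)
n_paths <- 5000; horizon <- 100

sd_at_t <- function(phi) {
  paths <- replicate(n_paths, {
    eps <- rnorm(horizon)
    inc <- as.numeric(stats::filter(eps, phi, method = "recursive"))  # AR(1)
    cumsum(inc)
  })
  apply(paths, 1, sd)          # cross-sectional sd at each horizon t
}

s_iid <- sd_at_t(0)            # independent increments
s_ar1 <- sd_at_t(0.6)          # persistent increments

plot(1:horizon, s_ar1 / s_ar1[1], type = "l",
     xlab = "horizon t", ylab = "risk (normalized)")
lines(1:horizon, s_iid / s_iid[1], lty = 3)   # hugs sqrt(t)
lines(1:horizon, sqrt(1:horizon), lty = 2)    # sqrt(t) reference
```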
NOTE: I now post my TRADING ALERTS to my personal FACEBOOK ACCOUNT and TWITTER. Don't worry, I don't post stupid cat videos or what I eat!

4 days left for the ABSOLUTE LOWEST rates you will get to learn trade secrets in HFT, quant analysis, and starting funds

Hi there

The 4 day countdown has begun for the absolute LOWEST rates you will get in my QuantLabs.net Premium Membership. It goes up 50% come this Tues Jan 21: no ifs, no buts, end of story. Remember, I tried to be a nice guy by allowing an extra week for fence sitters.

I have done some serious research over the last day on what I consider trade secrets in rapidly building trading models and strategies, leveraging the power of Matlab's environment. I am also looking into potentially starting a fund through some sort of partnership arrangement. That is how serious I am about my learning.
1. YouTube videos on TRADE secrets for hedge funds and HFT are starting to get posted in the quant membership
It is getting serious now, as I have some highly qualified people helping out.
2. Matlab Simulink could be a perfect way to build logic for your HFT kill switch, which will be required by federal regulators like the SEC
This is something every HFT operator should be aware of. The SEC is proposing a hefty $400K/min fine for any automated algo that goes off the rails and is not corrected. (A toy sketch of kill-switch logic follows this list.)
3. Is the SFA the legit way to raise international capital in Switzerland? Even the cantons offer free office space for your fund business.
Sure, I do get some serious UNSOLICITED capital offers, which is making me think very seriously about a fund. Maybe in Switzerland?
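On the kill-switch point: here is a toy sketch of the logic in R rather than Simulink; the thresholds, names, and interface are entirely made up for illustration.

```r
# Toy kill-switch check (illustration only; thresholds are invented).
kill_switch <- function(pnl, orders_per_sec, position,
                        max_loss = -50000, max_rate = 500, max_pos = 1e6) {
  breaches <- c(loss     = pnl < max_loss,
                rate     = orders_per_sec > max_rate,
                position = abs(position) > max_pos)
  if (any(breaches)) {
    message("KILL: ", paste(names(breaches)[breaches], collapse = ", "))
    return(TRUE)   # caller should cancel all working orders and halt the algo
  }
  FALSE
}

kill_switch(pnl = -60000, orders_per_sec = 120, position = 2e5)  # trips "loss"
```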
Again: This is the absolute LOWEST rate you will ever get going forward. 
Need further convincing? Go here for the benefits.
Thanks either way
Bryan

Why we moved from Matlab to R for quant analysis, model development, and building a trading strategy

Two questions from a visitor to the site

> how do i sign up?
> any tools for matlab, or purely R (also fine)?

We do have some Matlab courses, as I started out in Matlab. I now focus on R because I need to parallelize and cluster simulations, which is very expensive to do in Matlab. A rough sketch of the kind of parallel R code I mean is below.
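This is a minimal sketch using base R's parallel package, which ships with R; the simulation itself is a toy with invented parameters.

```r
# Run a toy Monte Carlo simulation across cores with the parallel package.
library(parallel)

simulate_path <- function(i, n = 252) {
  # one year of daily returns, compounded (invented drift/vol)
  prod(1 + rnorm(n, mean = 0.0003, sd = 0.01)) - 1
}

cl <- makeCluster(max(1, detectCores() - 1))
annual_returns <- parLapply(cl, 1:10000, simulate_path)  # 10k paths in parallel
stopCluster(cl)

summary(unlist(annual_returns))
```

In Matlab the equivalent parfor loop needs the Parallel Computing Toolbox license; here it is all in the base distribution.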

Go here to sign up.

 


Here is Bloomberg TV's most talked-about quant model analysis and trading strategies used by hedge fund managers and institutional traders

Here are the types covered, including:

  • Yield curve analysis
  • Fat tail analysis
  • Beta
  • Correlation

This is what I plan to look at in building my models; a rough R sketch of a few of these metrics follows below. R coding links are coming too, so check the R-Blog at quantlabs.net/r-blog
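A hedged R sketch of beta, correlation, and a fat-tail measure on toy data (the series and numbers are invented; yield curve analysis via PCA appears in the first post above).

```r
# Toy return series (invented): a "market" and a high-beta stock.
set.seed(3)
mkt   <- rnorm(500, 0.0003, 0.01)
stock <- 1.3 * mkt + rnorm(500, 0, 0.008)

beta <- cov(stock, mkt) / var(mkt)   # regression beta vs the market
rho  <- cor(stock, mkt)              # correlation

# Fat tails: excess kurtosis of the stock's returns (0 for a normal).
kurt <- mean((stock - mean(stock))^4) / sd(stock)^4 - 3

round(c(beta = beta, correlation = rho, excess_kurtosis = kurt), 3)
```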


Is this big data? What metrics would you run against this dataset for quant analysis?

The dataset consists of 65,000+ customers and 2,000+ SKUs, served through 20+ channels. It is comprehensive: granular profitability is calculated at the intersection of customer, channel, and product. ~300 million line items are calculated monthly, including YTD data, all within a closing window of <18 hours, as subsegment P&Ls are generated from the recordset. Do you consider this big data? If so, or if not, why?

@quantalyst

 

==

Nope. Too small. Easily fits inside a relational model.

 

==

Do you plan to merge this data with any social media or other semi-structured data (images, financial documents, etc.)? What is the velocity associated with the collection of this data? I am guessing there has been some data modeling done here, but what rate of change do you anticipate in the different data assets that you plan to use? These items will provide insight into whether this would leverage a big data solution or not.

However, to a point: if the data footprint can fit into a database and the data asset and the elements within it can be serviced well in a relational solution, pursue that first.

 

==

This is certainly at the smaller end of the spectrum of what is generally accepted as Big Data. Big Data is typically defined by the variety and volume of data, but add in velocity and that can all change. You say you're analyzing this all in monthly batches in under 18 hours. Your data can become 'big' if your company needs to start getting this information weekly or even daily to benefit from tracking customer/market trends or evaluating channel metrics and performance, enabling more agile decision making. Then you need to bring that 18 hour window down to, say, 3 or 4 hours, as well as reduce your nightly batch windows by enough to accommodate the new process. That is a lot easier and cheaper than people tend to think, but it still requires an investment. Decide what data is most important to you now, and how often and how quickly you need it, and base your solution strategy on that.

 

==

Thanks – appreciate the responses… more to come, but I'll start some new threads. Some issues: velocity – definition and appropriate standard measures; RDBMS – don't we have to get the data into some type of OLAP to actually analyze it?

 

==

It depends on what the analytics are. If you can do what you need to in a relational format easily and are thinking in terms of cubes, you'll likely be better served doing that. If, however, your needs include the flexibility to expand/add/drop data sets for experimentation, analytics where a query structure isn't the most natural way to express the question, or computation that is very expensive (think machine learning for pattern identification), then big data tooling can make sense even if the data set itself is not "big".

==

 

There is this expression: "If all you have is a hammer, then all your problems look like a nail." I'm sure I fractured that statement, but you get the idea…

Based on your initial problem definition, you have to ask yourself if you can easily fit the problem into an RDBMS, meaning the data does not require the scale of Hadoop.

Based on your initial numbers… Not even close.
It's surprising that while Tom works for a large vendor that's spending lots of money attempting to buy market share, he forgets that his company also sells two RDBMS engines that are probably better equipped to solve your use case. Or rather, one of those two engines. (This happens to be that proverbial hammer. 😉)

Using IBM as an example, check out IDS.
Here you can use the engine as your OLTP source, which is what you will want since you are talking about a system of record. You have built-in extensibility in that you can extend the relational model (see Stonebraker's Illustra, which IFMX bought in '95). There is a feature called RTL that can load over 50k ticks of data a second, so you can really handle velocity. (Note: you added velocity as an afterthought, and your data uses do not suggest that level of velocity; RTL would be overkill.) But that same extensibility allowed them to create IWA, essentially an all-in-memory appliance which you can attach to your RDBMS engine and run queries across both machines using industry-standard SQL.

Of course there are limitations in terms of scalability. However for your data set size, they could be an option.

In terms of analytics… there used to be a partnership with NAG, hence the NAG data blade… so you've got that covered.

It's a pity that Janet killed Arrowhead. Had it gone through, things would have looked a bit different in terms of the big data space.

But I digress. The point is that I can solve your problem with a different toolset. As someone who is not a talking head but actually works as a solutions architect, I can tell you each solution has its share of trade-offs. You have to balance the pluses and minuses when trying to find the right solution for you…

 

==

You evidently missed the part where I said "If you can do what you need to in a relational format easily and are thinking in terms of cubes, you'll likely be better served doing that."

 

==

You were just regurgitating what was already posted in earlier responses. 🙂

You went on to make a comment about how it's not always the size of the data but the complexity of the analytics…
Which again I point to advances in IDS, stable for the past 10+ years, that handle complex analytics.

Again, size and complexity point to a non-Hadoop solution. Add velocity, which the OP did, and again IDS solves that issue.

While you work for IBM, in IM, you don't really know your own product sets. Typical for IBM. Don't feel bad though; I seriously doubt more than a handful of people could tie all the products in IBM's portfolio together…

Like I said in a terse post way at the top… Not a big data problem…;-)

 

==

Statistics are maintained at each of the 300 million data points. The stats, first, drive costs to the appropriate channel/customer/product and, second, intersect those costs to arrive at a cost at the channel/customer/product level.

Some of the statistics and sources include:
a) Route Management System (#Services, Channel_ID, TimeOnRoute);
b) "Hand Held" System – time stamps (#MinutesAtService for various activities);
c) Warehouse Management System (labor activity distribution);
d) 3rd Party Freight Management System (freight lanes and costs);
e) Inventory (#QuantityOnHand);
f) Certain specialized databases (Assets, Payroll);
g) ERP transactional system, including Inventory Transfer (SKU counts across lanes), G/L (costs), and Order Entry (order counts by SKU & Customer)

The 300 million row result set is a single source for all SKU/customer/channel reporting and analytics, giving management transparency into the customer supply chain. It drives tactical and strategic decisions including: a) subsegment P&Ls; b) pricing (value of the customer relationship); c) process improvement; d) logistics optimization; e) product release profiles; f) new customer profiles. (A toy sketch of the allocation step follows.)
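To make the allocation step concrete, a hedged R sketch: drive one cost pool to customer/channel/SKU intersections in proportion to an activity statistic. All column names and figures are invented for illustration.

```r
# Toy activity-based allocation (invented data, not the poster's schema).
stats_df <- data.frame(
  customer = c("C1", "C1", "C2", "C2"),
  channel  = c("route", "route", "route", "web"),
  sku      = c("A", "B", "A", "B"),
  minutes  = c(30, 10, 40, 20)        # e.g. #MinutesAtService
)
route_cost_pool <- 10000              # monthly cost of the route channel

# Allocate the pool in proportion to minutes at each intersection.
route <- subset(stats_df, channel == "route")
route$alloc_cost <- route_cost_pool * route$minutes / sum(route$minutes)
route   # cost per customer/channel/SKU row, auditable back to the pool
```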

Yes, it is RDBMS/OLAP driven, monthly (auditable to the G/L). It seems the consensus is that calling this "big data" is a stretch… maybe "very large data". However, this is driving decision making – is "big data" doing the same?

@quantalyst

 

==

If you want to fix the 18 hour job cycle then, yes, some big data technologies might be useful for parallelizing the analysis. Hadoop jobs don't really let you pack much of an algorithm into one pass. You could build a preprocessing phase that creates a common input dataset, and then run separate parallel algorithms on that for your various analyses; a rough sketch of that pattern is below.

There are open source OLAP databases (Pentaho, for one), so you could have several servers, each running a different analysis.
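Here is one way to sketch the "common preprocessing, then parallel analyses" pattern in R; the analysis functions are stand-ins, not the poster's actual jobs.

```r
# Build the shared input once, then fan out independent analyses to workers.
library(parallel)

preprocess <- function() data.frame(x = rnorm(1e6), y = rnorm(1e6))  # stand-in
analyses <- list(
  totals   = function(d) colSums(d),
  spread   = function(d) sapply(d, sd),
  relation = function(d) cor(d$x, d$y)
)

common <- preprocess()
cl <- makeCluster(length(analyses))
clusterExport(cl, "common")                       # ship the shared dataset
results <- parLapply(cl, analyses, function(f) f(common))
stopCluster(cl)
str(results)
```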

 


Algo strategy development course for High Frequency Trading: 72 videos already posted, FREE daily quant analysis, and more

Hey
Here is the outline of my new online Algo Strategy Development course. Over 72 algos are already preloaded and ready to go! This is only available to Premium Members:
1. Evaluating performance and HFT strategies
2. Order types for HFT
3. Market inefficiency and profit opps at different frequencies
4. Searching for HFT opps
5. Working with tick data
6. Trading on market microstructure
7. Event arbitrage
8. Statistical arbitrage in HFT
9. Managing portfolios
10. Backtesting trading models
11. Risk management
12. Executing and monitoring the HFT environment
13. Post-trade profitability analysis
New videos are being posted frequently for each new section. Get in on the affordable membership rates now!
We also have a new Membership Benefits section, which gives you access to this course:
http://quantlabs.net/quant-member-benefits/slash-your-quant-learning-curve/

Don't forget this also gives you access to our exclusive QuantLib and Open Source High Frequency Trading Platform course with our 2 GB forex database. Did I also mention you get access to countless software development examples?

Our free quant analysis is also live; it reports the best performing stocks from the US, UK, Canada, and Australia daily:

http://quantlabs.net/analysis/

There has been a lot going on, so make sure you get in on the quant algo/strategy development course, as many practitioners already have.

http://quantlabs.net/dlg/sell.php?prodData=m%2C3

Thanks Bryan

PS: I have an introductory video on this course:
http://www.youtube.com/watch?v=-mn9lzzYz6o


Free quant analysis with 30 models from best performing daily stocks and market security assets

http://quantlabs.net/analysis/

 


YouTube video opinion on a MATLAB webinar: technical vs quant analysis, HFT, models, easy FPGA/GPU deployment

Learn how I approach this material here.

Webinar link

http://www.mathworks.com/company/events/webinars/wbnr52491.html?id=52491&p1=801729410&p2=801729428


What indicators do you use for technical analysis and quant analysis?

 

==
This is my latest developed theory: http://www.youtube.com/watch?v=2_PTmXkK2Qo&feature=related

==

 

ADX, RSI, Moving Averages

 

==

Moving averages, SST, ADX, and Volume. Keeping charts as simple as possible and understanding them are the keys for traders.

 

==

Don't use any indicators. Indicators are lagging by nature. You can trade successfully without any indicators.

==

 

I trade without indicators, simply using price action.

 

==

I trade without indicators: price action and trend lines with cyclical analysis modelling. That's it. Using technical indicators is just as good as asking your mum if you're a good boy.

 

==

It's a great question that helps pinpoint your trading style. I like using a combination of support and resistance trend lines (which are subjective) and the MACD on breakouts and breakdowns. Breakouts I find usually coincide with momentum, which the MACD attempts to capture. On pairs (or mean-reverting positions such as long/short pairs and pair options), I like to use Bollinger bands. (A small sketch of this setup follows.)
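For anyone wanting to try the above, a small sketch with the TTR package on a toy price series (default parameters, not the poster's exact settings).

```r
# MACD for breakout momentum, Bollinger bands for mean reversion (TTR package).
library(TTR)

set.seed(5)
price <- cumsum(rnorm(300, 0.05, 1)) + 100   # toy price series

m  <- MACD(price, nFast = 12, nSlow = 26, nSig = 9)  # momentum gauge
bb <- BBands(price, n = 20, sd = 2)                  # reversion bands

tail(cbind(macd = m[, "macd"], signal = m[, "signal"],
           lower = bb[, "dn"], upper = bb[, "up"]))
```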

 

==

That's easy for me: it's the RSI. I use it for trend analysis, identifying trend changes (reversals), and for its ability to forecast future price targets and objectives.

 

==

 

 

Fibonacci, Stochastics, and Fx Pulse (a trend detector). I can say that Fx Pulse is a nice, free MT4 indicator. It not only shows the trend for all time frames in real time but also saves you time by displaying the latest Forex news headlines. Feel free to download Fx Pulse and other indicators and Forex calculators at http://pipburner.com/free-forex-trading-tools/

 

==

I think the most important tool is yourself, putting together all the information: CCI, RSI, price action, whatever. It's your interpretation of that information that matters. There is no secret combination of indicators or candle reading that's going to work every time. The best indicator, in my opinion, is IMMERSION. "The harder I work, the luckier I get."

 

 

==

We have a multi-layered system. It begins with our Early Warning program, which is a very early alert to momentum changes. It is very early and we do not trade from it: http://www.quacera.com/tools-and-reports/early-warning-report/
Our next system is QPM Radar™, which shows very accurately when individual securities are moving positive or negative: http://www.quacera.com/tools-and-reports/daily-qpm-radar-report/

 

 

==

Andrews' Pitchfork (a leading indicator) and a triple moving average for setup, entry, and exit. Very simple.

 

 

==

SMA 3x13x39 system; DiNapoli MACD; DiNapoli slow stochastic. Two Bollinger Bands overlaid, with the deviation of the second band changed from 2 to 1, thus creating an outer channel. If the candles are moving along in the outer channel, the market is trending. If they are just bouncing along sideways, stand aside. (A sketch of the double-band idea follows.)
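The double-band idea above, sketched with the TTR package on toy prices (my reading of the setup, not the poster's code).

```r
# Two Bollinger Bands (sd = 2 and sd = 1) forming an outer "trend" channel.
library(TTR)

set.seed(8)
price <- cumsum(rnorm(300, 0.08, 1)) + 100   # toy price series

outer_b <- BBands(price, n = 20, sd = 2)
inner_b <- BBands(price, n = 20, sd = 1)

# Trending up: price sits between the inner and outer upper bands.
trending_up <- price > inner_b[, "up"] & price <= outer_b[, "up"]
sum(trending_up, na.rm = TRUE)               # bars flagged as trending up
```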

 

==

Bollinger Bands, RSI, MACD, Parabolic SAR (not much), Volume, and sometimes ATR.

 

 

==

Bollinger bands, MACD and RSI

 

==

I currently use Bollinger bands, RSI, volume, and two moving averages. It works very well, for me at least.

 

 


Youtube video: Is it fundamental analysis, quant analysis, or technical analysis for best trading performance?
