Sunday, November 24, 2013

Artificial Intelligence and Its Use in Making Money

Tokyo Stock Exchange via Dick Johnson


Stock trading. Many people come together at stock exchanges solely to try their best at buying low and selling high. Stock markets have existed for centuries, and for most of that time everything was done by hand. Making the necessary computations and very educated guesses was tedious work, suited only to serious mathematicians. Algorithms came in handy, but could still prove slow and imprecise.

The idea of programming an artificial neural network has been a work-in-progress for the past several decades. However, it failed to gain real traction before the introduction of the modern computer. Now that processors are faster and more efficient at crunching large amounts of numerical data, stock experts are beginning to figure out how best to implement neural networks to their advantage. By using such networks, computers can be "taught" to function similarly to the human brain in stock market scenarios, except with a much greater advantage. For instance, the following statement describes the Neural Fair Value (NFV) System:

"The Neural model is based on "Neural Networks" theory, an artificial intelligence concept designed to replicate the human brain's ability to learn. During a neural model's training period, prediction errors are reduced by adjusting inputs." (via Neuroshell)

Neural networks can "learn" from the past much as a human would, but far more efficiently, drawing on both the mistakes and the successes of earlier predictions. By doing this, neural networks can adapt and better predict stock fluctuations.
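The training idea described above can be sketched in a few lines. This is a toy illustration, not a trading system: a single artificial neuron learns to predict the next value of a made-up price series, and its prediction error shrinks as the weights are adjusted (the NFV quote says "adjusting inputs," but in standard neural-network training it is the weights that get adjusted).

```python
# Toy sketch: one artificial neuron trained by gradient descent to
# predict the next value of a price series from the previous `window`
# values. The price data below is fabricated for illustration.

def train_neuron(series, window=3, lr=0.001, epochs=500):
    weights = [0.0] * window
    bias = 0.0
    for _ in range(epochs):
        for i in range(window, len(series)):
            inputs = series[i - window:i]
            prediction = bias + sum(w * x for w, x in zip(weights, inputs))
            error = series[i] - prediction
            # "Prediction errors are reduced by adjusting" the weights:
            for j in range(window):
                weights[j] += lr * error * inputs[j]
            bias += lr * error
    return weights, bias

def predict(weights, bias, recent):
    return bias + sum(w * x for w, x in zip(weights, recent))

# Fabricated, gently rising "price" series.
prices = [10.0, 10.2, 10.1, 10.4, 10.5, 10.7, 10.6, 10.9, 11.0, 11.2]
w, b = train_neuron(prices)
next_price = predict(w, b, prices[-3:])
```

After training, the neuron's guess for the next price lands near the recent values instead of at its untrained starting point of zero; that narrowing gap is exactly the "training period" the NFV description refers to.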

It sounds like neural networks will be incredibly useful in the future of stock trading. However, research and development are still ongoing - no perfect algorithm has been developed quite yet.


* http://www.neuroshell.com/Successful%20Trading%20Using%20Artificial%20Intelligence.pdf
* http://www.neuroshell.com/traders.asp?task=interviews&id=19
* http://www.f.kth.se/~f98-kny/thesis.pdf
* http://www.businessweek.com/stories/2006-05-07/a-neural-approach-to-the-marketbusinessweek-business-news-stock-market-and-financial-advice

Thursday, November 14, 2013

Why Computer Science is, and will remain, the next step in humanity's technological progression



via theverge.com
"Quantum memories capable of storing and retrieving coherent information for extended times at room temperature would enable a host of new technologies." (Science Magazine Research Paper/Journal - link provided below)
My cousin is an Information Technology manager at a fairly large organization. His office used to be a side room off the closet that housed the company's servers - huge, hot servers. Servers run very warm because they are accessed many times in short timespans and run nearly nonstop. Because of this, a cooling system is essential in that closet, as in any other datacenter, so that systems won't suddenly fry and break down, forcing intensive and important processes offline.

Being able to work alongside these machines in comfort would certainly boost morale at the workplace. What about conforming to more efficient "Green" standards by needing less air conditioning and fewer fans? How could that be achieved? Quantum computing seems to be a possible solution. Why merely a possibility? It is still being heavily researched at the time of writing. The most recent significant result was storing quantum data at room temperature for thirty-nine minutes, as opposed to the earlier record of just two seconds. The data remained stable enough that some of the stored qubits could still be read.

Think of the benefits this achievement could provide, especially as computer technology and the rest of the industrial world continue to intersect. Once this technology truly blossoms, more and more servers will be able to sit in close proximity to each other without nearly as much danger of overheating or excessive power draw. Clusters of computing machines will be able to run and crunch numbers longer, faster, and more safely while remaining much more reliable than ever before.

Folding@Home is just one example of what this improved technology could contribute to. Folding@Home (abbreviated FH from now on) lets people install software that uses a specified amount of their computer's processing power to work with other machines around the world on disease research. This short excerpt is taken directly from their website, which is hosted by Stanford University:

"Help Stanford University scientists studying Alzheimer's, Huntington's, Parkinson's, and many cancers by simply running a piece of software on your computer.

The problems we are trying to solve require so many calculations, we ask people to donate their unused computer power to crunch some of the numbers."

Users can set the process to run in the background of their current work or while they're sleeping; all in the name of helping others.
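The distributed-computing idea behind FH can be sketched simply: split one enormous job into independent "work units," let separate workers process them, then combine the partial results. The sketch below is illustrative only - the function names are made up, and threads stand in for the volunteers' machines (the real project sends each unit to a different computer over the network).

```python
# Sketch of the "work unit" idea behind volunteer computing projects
# like Folding@Home. Names here are illustrative, not FH's real API.

from concurrent.futures import ThreadPoolExecutor

def work_unit(chunk):
    # Stand-in for a heavy scientific computation on one work unit.
    return sum(x * x for x in chunk)

def run_distributed(data, n_workers=4):
    # Split the big job into independent chunks ("work units").
    size = len(data) // n_workers
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Threads stand in for volunteers' machines; each processes its
    # own unit without needing to see anyone else's data.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        partials = list(pool.map(work_unit, chunks))
    # Combine the partial results into the final answer.
    return sum(partials)

total = run_distributed(list(range(10_000)))
```

Because every work unit is independent, it doesn't matter that the workers are scattered across the world, run at different speeds, or go offline while their owners sleep - the results still add up the same way.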


FH isn't the only project that stands to benefit from this advancement. Professional filmmakers need vast, powerful machines to render the heavy video they produce day to day, but those machines are limited by their weaknesses: sustained heavy loads can burn hardware out, and the resulting files can be enormous. Pixar, the famous animation studio, requires extraordinary computing power to produce the average feature film we watch. Monsters University, a recent animated film, is just one example:

"In fact, it took Pixar's server farm about 29 hours to render each frame of Monsters University, the prequel to Monsters Inc., one of the studio's most beloved films." (via VentureBeat)
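A back-of-the-envelope calculation shows why a server farm is non-negotiable at that scale. The 29 hours per frame is from the article above; the film length (assumed roughly 100 minutes) and the 24 frames-per-second rate are illustrative assumptions of mine, as is the hypothetical farm size.

```python
# Back-of-the-envelope render-farm arithmetic. Only the 29 hours per
# frame comes from the article; runtime, fps, and machine count are
# assumptions for illustration.

HOURS_PER_FRAME = 29
FPS = 24                 # standard film frame rate
RUNTIME_MINUTES = 100    # assumed film length

frames = RUNTIME_MINUTES * 60 * FPS
total_hours = frames * HOURS_PER_FRAME   # if rendered one frame at a time

# Frames render independently, so a farm divides the wall-clock time
# (assuming one frame per machine at a time, and ignoring overhead).
machines = 1000          # hypothetical farm size
wall_clock_days = total_hours / machines / 24
```

Under these assumptions that is 144,000 frames and over four million machine-hours - centuries on a single computer, but under half a year on a thousand-machine farm. The same embarrassingly-parallel structure that makes FH work makes film rendering feasible.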

Computer Science will always play some significant role in humanity's fight to stay relevant, progressive, and responsible. Quantum computing is only the beginning.

http://www.sciencemag.org/content/342/6160/830
http://folding.stanford.edu/
http://venturebeat.com/2013/04/24/the-making-of-pixars-latest-technological-marvel-monsters-university/

Sunday, November 10, 2013

File Transfer - where should my data live and how should I access it?

A plethora of ways to transfer data have appeared over the years: floppy disks, Compact Discs, Zip drives, flash drives, e-mail, file-upload websites.

"Well, it's all up in the air." The current hype - cloud networking, cloud storage, and so on - fits that statement well. Most people imagine any data transfer associated with a metaphorical "cloud" as simple, wispy, and almost effortless: someone logs in to some service - Google Drive, Dropbox, Box, Mega, or another - and drags-and-drops files to a location that is easily accessible from many places. These services tend to provide several ways to access the data you or your peers have uploaded, such as Secure File Transfer Protocol (SFTP), regular File Transfer Protocol (FTP), or the much more user-friendly Hypertext Transfer Protocol (HTTP).

SFTP has become increasingly important in today's security-conscious climate. Take, for example, the process of configuring a new server. Unless the person managing it has direct physical access at all times, there will most definitely come a time when they need to modify important files over a remote connection, and that creates a very tricky dilemma: "When using FTP it will be possible for a hacker to capture network traffic between your desktop and the web server and so discover these passwords." (ArtSec, "The benefits of SFTP in website security")


Unfortunately, many technology users are uneducated about where they should store their data, as well as about how that storage is even constructed.

So where should data live? In redundant storage: servers capable of RAID or RAID-like functionality. On top of that, we also desperately need secure connections and protocols for fetching the data. SFTP works quite well, but few people know what it is unless they work in a technology-based field.
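The redundancy idea behind RAID can be shown with a toy example. RAID 5 stores an XOR parity block alongside the data blocks, so losing any single disk is survivable: the missing block can be rebuilt from the surviving blocks plus the parity. The "disk" contents below are made up, and real RAID operates on whole drives, not eight-byte strings.

```python
# Toy sketch of RAID-style redundancy via XOR parity (the idea behind
# RAID 5). The block contents are fabricated and must be equal-sized.

def xor_blocks(blocks):
    # XOR all blocks together byte-by-byte.
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

data_blocks = [b"disk-one", b"disk-two", b"disk-3!!"]
parity = xor_blocks(data_blocks)   # stored on a separate disk

# Simulate losing the middle disk, then rebuild its contents from
# the survivors plus the parity block.
survivors = [data_blocks[0], data_blocks[2]]
rebuilt = xor_blocks(survivors + [parity])
```

Because XOR is its own inverse, XORing the survivors with the parity yields exactly the lost block - which is why a RAID 5 array keeps serving data even while one drive is dead.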

*https://artsec.com/2011/12/19/the-benefits-of-sftp-in-website-security/

Sunday, November 3, 2013

Is your treasure secure? Data structures and why they can't be taken for granted.

What happens when you need to call someone from your home phone or cell phone? You normally navigate your device's interface to a particular section of sorted data commonly called the "Address Book" (or some similar variation). Any contact information you have stored, locally or via a network of some kind (quite possibly a cloud network), is readily accessible. But have you ever thought about how such a feature actually works?



Data needs to be secure. Most often, security is thought of as protection of information. However, there is another definition that information technology professionals swear by and that the common user base tends to ignore: information is also "secure" when it can be depended on - stored so that it is quick and safe to access and modify. Data structures underpin databases, some much like the Address Book mentioned earlier, and properly developing the tools for accessing and modifying them can be tricky. For instance, data needs to be retrievable as quickly as possible - you want the address book to find a specific contact at least slightly faster than it would take you to remember an arbitrary number and dial it. Many algorithms have been invented, analyzed, and updated in order to take advantage of modern technologies.
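The speed difference comes down to the choice of structure. A minimal sketch, with made-up contact names and numbers: scanning an unsorted list checks every entry (O(n)), while a hash table - which is roughly how a real address book index works - jumps straight to the entry in constant time on average (O(1)).

```python
# Sketch of why data-structure choice matters for an address-book
# lookup. All names and numbers here are fabricated.

contacts_list = [("Alice", "555-0101"), ("Bob", "555-0102"),
                 ("Carol", "555-0103")]

def lookup_by_scan(name):
    # Linear scan: checks entries one by one until it finds a match.
    # With n contacts this takes O(n) comparisons in the worst case.
    for entry_name, number in contacts_list:
        if entry_name == name:
            return number
    return None

# A hash table (Python's dict) computes where the entry lives from
# the name itself, so lookup is O(1) on average regardless of size.
contacts_dict = dict(contacts_list)

def lookup_by_hash(name):
    return contacts_dict.get(name)
```

With three contacts the difference is invisible; with the thousands of entries a synced cloud address book can hold, the hash-table version is the reason the right number appears before you could have recalled it yourself.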

As such, data structures and associated algorithms are essential to supporting the kinds of services that are vital to modern society. There are a plethora of different uses for these tools.  



