Sunday, December 15, 2013

What is the greatest prime number of them all?



One of the greatest mathematical challenges and brain twisters of our time is the search for the largest known prime.

Euclid proved in his book The Elements that there are infinitely many prime numbers. One might think that people would be satisfied knowing the set of primes is not finite and cease their search. However, the hunt for ever-larger primes lives on today.

The Great Internet Mersenne Prime Search (GIMPS) is an organization seeking out the next, larger Mersenne prime. To this end, participants use powerful computers and efficient algorithms to try to find one. When a new prime emerges from this stringent, tough, and interminable process, it must be verified independently by several sources to make sure it wasn't a fluke. For instance, a hardware fault or memory error on the computer involved could cause a miscalculation and make the reported number incorrect. The people contributing to this effort need to be sure they aren't making non-prime numbers famous.

The test commonly used to verify Mersenne primes is called the Lucas-Lehmer test and can be stated as follows:

For an odd prime p, the Mersenne number 2^p - 1 is prime if and only if 2^p - 1 divides S(p - 1), where S(n + 1) = S(n)^2 - 2 and S(1) = 4.
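
Here is a minimal sketch of that test in Python, using the language's built-in arbitrary-precision integers as a stand-in for the heavily optimized arithmetic GIMPS actually runs:

```python
# A minimal sketch of the Lucas-Lehmer test. Python's big integers stand in
# for the highly optimized arithmetic that GIMPS actually uses.

def lucas_lehmer(p):
    """Return True if the Mersenne number 2**p - 1 is prime (p an odd prime)."""
    m = 2 ** p - 1            # the Mersenne number under test
    s = 4                     # S(1) = 4
    for _ in range(p - 2):    # iterate S(n+1) = S(n)^2 - 2 up to S(p-1)
        s = (s * s - 2) % m   # reduce mod m so the numbers stay small
    return s == 0             # 2**p - 1 is prime iff it divides S(p-1)

# The first few odd prime exponents: 2**11 - 1 = 2047 = 23 * 89 is rejected.
print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
```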

GIMPS is very similar to another distributed number-crunching project that I mentioned in a prior post: Folding@Home, which hopes to crunch its way to cures for serious illnesses and diseases.

* The Elements of Euclid
* Mersenne
* Great Internet Mersenne Prime Search

Sunday, December 8, 2013

Computer Graphics

"A picture is worth a thousand words."

This idea has been circulating for just over a century. It was said at a time when newspapers were the norm, not computers. What does it mean for us in our now technologically advanced society (relative to the early twentieth century)?

We've come to a time when graphics absolutely must be used to tell a story, but it goes much deeper than that. Take, for instance, video games. In recent years they have used special imaging to capture the motion of a real person and then paint a face over polygons. For a long time this was limited by the power of the machinery behind it. Now, with modern computational power, we can actually model an entire head.




Our technology has surpassed barriers we used to think permanent. We can now sculpt digital human heads that look eerily like the real person's.

The most important thing to remember is how far we have come in just a few decades. In the late 80s and 90s, computer programs relied on the Bresenham line algorithm. The algorithm was used mostly for plotters, and thus for the creation of vector graphics; even so, its contribution to the world of computer-generated graphics shows how fast technology is moving.
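
For reference, here is a minimal sketch of the algorithm in Python; it walks the grid between two endpoints using only integer arithmetic, and the coordinates in the example are arbitrary:

```python
# A minimal sketch of Bresenham's line algorithm (integer arithmetic only),
# returning the grid points that approximate the line between two endpoints.

def bresenham_line(x0, y0, x1, y1):
    points = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy                  # running error term
    while True:
        points.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:               # error says: step in x
            err += dy
            x0 += sx
        if e2 <= dx:               # error says: step in y
            err += dx
            y0 += sy
    return points

print(bresenham_line(0, 0, 5, 3))  # [(0, 0), (1, 1), (2, 1), (3, 2), (4, 2), (5, 3)]
```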

http://www.idav.ucdavis.edu/education/GraphicsNotes/Bresenhams-Algorithm.pdf

Sunday, December 1, 2013

Internet Message Access Protocol and Post Office Protocol



Internet Message Access Protocol (IMAP) and Post Office Protocol (POP) are the mainstream protocols that most e-mail services currently use, and they differ in significant ways. Why does it matter how your client is set up? Because you can set up, say, your smartphone's e-mail application to use either POP or IMAP.

IMAP almost always stores the inbox itself (consisting of all e-mails received) on the central e-mail server. One benefit of this is that all mail can be backed up frequently: in the case of a large-scale crash, your data is preserved in more than one server location, and once the outage has been fixed the e-mail attached to your account returns to your regular inbox as usual. For authentication, IMAP supports challenge-response mechanisms such as CRAM-MD5, which rely on cryptographic hashes of a shared password instead of sending the password itself between client and server. If authentication fails, the user must intervene and provide some form of confirmation, or the connection is severed and labeled as suspect. An unreliable connection can complicate this process; if packets are lost mid-exchange, authentication can fail and have to be retried.
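
As a small sketch of the IMAP model, here is what a client session might look like with Python's standard imaplib module; the host name and credentials below are placeholders, not a real account:

```python
# A minimal IMAP session sketch: the mailbox stays on the server, and the
# client just asks questions about it. Host and credentials are placeholders.
import imaplib

with imaplib.IMAP4_SSL("imap.example.com") as imap:
    imap.login("user@example.com", "app-password")
    imap.select("INBOX", readonly=True)            # mail remains on the server
    status, data = imap.search(None, "UNSEEN")     # ask the server for unread mail
    print("unread message ids:", data[0].split())
```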

POP works quite differently from IMAP. E-mail sits on the server only until the client fetches it, and is then saved more or less permanently on the computer being used. As a result, messages usually do not remain on the server unless the client is configured to leave copies behind, so there is no way to resurrect lost data unless the user has been keeping their own backups. For security, POP3 offers the APOP command, a challenge-response mechanism based on a secret shared between client and server, so the password need not be sent in the clear; managing those shared secrets adds some complexity to accessing the data.
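
The POP model, by contrast, is download-and-keep-locally. A comparable sketch with Python's standard poplib (placeholders again):

```python
# A minimal POP3 session sketch: messages are pulled down to the local machine,
# and a typical client would then delete them from the server.
import poplib

pop = poplib.POP3_SSL("pop.example.com")
pop.user("user@example.com")
pop.pass_("app-password")

count, size = pop.stat()                   # number of messages and total size
for i in range(1, count + 1):
    response, lines, octets = pop.retr(i)  # download message i locally
    # pop.dele(i) would mark it for deletion on the server (the usual POP model)
pop.quit()
```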


http://tools.ietf.org/html/rfc3501
http://tools.ietf.org/html/rfc5321
http://tools.ietf.org/html/rfc1939

Sunday, November 24, 2013

Artificial Intelligence and Its Use in Making Money

Tokyo Stock Exchange via Dick Johnson


Stock trading. Many people come together at different stock exchanges solely to try their best at buying low and selling high. Organized markets of this kind have existed for centuries, and for most of that time everything was done by hand. Making the computations and very educated guesses was tedious work meant only for serious mathematicians. Algorithms came in handy, but could still prove slow and imprecise.

The idea of programming an artificial neural network has been a work in progress for the past several decades. However, it failed to gain real traction before the introduction of the modern computer. Now that processors can crunch huge quantities of numbers quickly and efficiently, stock experts are beginning to figure out how best to implement neural networks to their advantage. Using such networks, computers can be "taught" to function similarly to the human brain in stock market scenarios, except with a much greater advantage. For instance, the following statement describes the Neural Fair Value (NFV) System:

"The Neural model is based on "Neural Networks" theory, an artificial intelligence concept designed to replicate the human brain's ability to learn. During a neural model's training period, prediction errors are reduced by adjusting inputs." (via Neuroshell)

Neural networks would be able to "learn" from the past much like a human would, but far more efficiently. That learning includes both the mistakes and the successes of past problems. By doing this, neural networks can adapt and better predict stock fluctuations.
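
As a very rough illustration of that training loop, here is a sketch in Python (with NumPy) of a tiny feed-forward network learning to predict the next day's price change from a window of recent changes. The data is synthetic noise and the network layout is my own assumption, so this only demonstrates the "reduce prediction errors by adjusting" idea, not the NFV system itself:

```python
# A toy feed-forward network trained by gradient descent on a fake price series.
import numpy as np

rng = np.random.default_rng(0)
changes = rng.normal(0, 1, 500)              # synthetic daily price changes
window = 5

# Inputs: 5 past changes; target: the next change
X = np.array([changes[i:i + window] for i in range(len(changes) - window)])
y = changes[window:]

# One hidden layer with tanh activations
W1 = rng.normal(0, 0.5, (window, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1));      b2 = np.zeros(1)

for epoch in range(200):
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    pred = (h @ W2 + b2).ravel()
    err = pred - y                           # prediction errors...
    gW2 = h.T @ err[:, None] / len(y)        # ...are reduced by adjusting weights
    gb2 = np.array([err.mean()])
    dh = (err[:, None] @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / len(y)
    gb1 = dh.mean(axis=0)
    for param, grad in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
        param -= 0.1 * grad

print("mean absolute training error:", np.abs(err).mean())
```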

It sounds like neural networks will be incredibly useful in the future of stock trading. However, the research and development going into them is still ongoing - no perfect algorithm has been developed quite yet.


* http://www.neuroshell.com/Successful%20Trading%20Using%20Artificial%20Intelligence.pdf
* http://www.neuroshell.com/traders.asp?task=interviews&id=19
* http://www.f.kth.se/~f98-kny/thesis.pdf
* http://www.businessweek.com/stories/2006-05-07/a-neural-approach-to-the-marketbusinessweek-business-news-stock-market-and-financial-advice

Thursday, November 14, 2013

Why Computer Science has become and will always be the next step in humanity's technological progression



via theverge.com
"Quantum memories capable of storing and retrieving coherent information for extended times at room temperature would enable a host of new technologies." (Science Magazine Research Paper/Journal - link provided below)
My cousin is an Information Technology manager at a fairly large organization. His office used to be located in a side room off the closet that housed huge, hot servers. Hot. Servers become very warm because they are accessed many times in short timespans and run nearly nonstop. Because of this, a cooling system is essential in that closet, as in any other datacenter, so that systems won't suddenly fry and break down, forcing intensive and important processes offline.

Being able to work alongside these machines in comfort would certainly boost morale at the workplace. What about conforming to more efficient "green" standards by needing less air conditioning and fewer fans? How do we achieve that? Quantum computing seems to be a possible solution. Why merely a possibility? Well, it is still being heavily researched at the time of writing. The most recent and significant result was storing quantum data at room temperature for thirty-nine minutes, as opposed to the earlier record of just two seconds. The data remained stable enough that some of the stored qubits could still be read.

Think of the benefits this technological achievement could provide, especially as computer technology and the rest of the industrial world continue to intersect. For instance, once this technology truly blossoms, more and more servers will be able to remain in close proximity to each other without nearly as much danger of overheating or overuse of power. Clusters of computing machines will be able to run and crunch numbers longer, faster, and more safely while remaining far more reliable than ever before.

The long-running Folding@Home project is just one example of what this improved technology could contribute to. Folding@Home (abbreviated FH from now on) lets people install software that donates a specified amount of their computer's processing power and works with other machines around the world to search for cures to health hazards. This short excerpt is taken directly from their website, which is hosted by Stanford University:

"Help Stanford University scientists studying Alzheimer's, Huntington's, Parkinson's, and many cancers by simply running a piece of software on your computer.

The problems we are trying to solve require so many calculations, we ask people to donate their unused computer power to crunch some of the numbers."

Users can set the process to run in the background of their current work or while they're sleeping, all in the name of helping others.


FH isn't the only situation that stands to benefit from this advancement of technology. Professional filmmakers need vast, powerful machines to render the heavy video they produce on a day-to-day basis. But these machines have their limits: run them too hard and they burn out, and the files they produce can be enormous. Pixar, the famous animation company, requires extremely powerful machines to produce the average feature film that we watch. Monsters University, a recent animated film, is just one example:

In fact, it took Pixar’s server farm about 29 hours to render each frame of Monsters University, the prequel to Monsters Inc., one of the studio’s most beloved films.
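
For a sense of scale, a rough back-of-the-envelope calculation (the runtime and frame rate are my assumptions; only the 29 hours per frame comes from the article):

```python
# Roughly how much rendering a ~104-minute film at 24 frames per second implies
# at 29 hours per frame. These are assumed figures, not Pixar's own numbers.
frames = 104 * 60 * 24                      # about 150,000 frames
compute_hours = frames * 29                 # about 4.3 million hours of rendering
print(frames, compute_hours, compute_hours / (24 * 365))  # roughly 500 years on one machine
```

That is exactly why the work is spread across a server farm rather than a single machine.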

Computer Science will always play some significant role in humanity's fight to stay relevant, progressive, and responsible. Quantum computing is only the beginning.

http://www.sciencemag.org/content/342/6160/830
http://folding.stanford.edu/
http://venturebeat.com/2013/04/24/the-making-of-pixars-latest-technological-marvel-monsters-university/

Sunday, November 10, 2013

File Transfer - where should my data live and how should I access it?

There has been a plethora of different ways to transfer data over the years: floppy disks, Compact Discs, flash drives, Zip drives, e-mail, file-upload websites.

"Well, it's all up in the air." The current hype - cloud networking/storage/etc - really describes that statement. Most people instantly imagine any data transfer associated with a metaphorical "cloud" as being simple, wispy, and almost effortless. As in, someone simply needs to login to some arbitrary service - be it Google Drive, Dropbox, Box, Mega, or another - and just drag-and-drop some files to some location that is easily accessible from many places. These websites have a tendency to provide several ways to access data you or your peers have uploaded to them. For instance, a user might use Secure File Transfer Protocol (SFTP), regular File Transfer Protocol (FTP), or much more user-friendly Hypertext Transfer Protocol (HTTP). SFTP has become increasingly viable in today's cyber security-prone. Take, for example, the process of configuring a new server. Unless the person managing has direct access to the server at all times, there will most definitely come a time when they need to modify important files from a remote connection. However, this can create a very tricky dilemma. "When using FTP it will be possible for a hacker to capture network traffic between your desktop and the web server and so discover these passwords." (ArtSec, "The benefits of SFTP in website security")


Unfortunately, many technology users are uneducated about where exactly they should store their data, or about how that storage is even constructed.

So where should data be? We need redundant storage. Servers capable of RAID or RAID-like functionality. On top of that, we also desperately need secure connections and protocols for fetching the data we need. SFTP works quite well, but not very many people know what it is unless they work in a technology-based field.

*https://artsec.com/2011/12/19/the-benefits-of-sftp-in-website-security/

Sunday, November 3, 2013

Is your treasure secure? Data structures and why they can't be taken for granted.

What happens when you need to call someone from your home phone or cell phone? You normally navigate your device's interface to a particular section of sorted data commonly called the "Address Book" (or some similar variation). Any contact information you have stored locally, or via a network of some kind (quite possibly a cloud network), is readily accessible. But have you thought about how such a feature operates?



Data needs to be secure. Most often, security is thought of as the protection of information. However, there is another aspect that information technology professionals swear by and that the common user base tends to ignore: information is only properly secure when it is also safe and practical to use in different ways. Data structures underpin databases, some much like the Address Book mentioned earlier, and developing the tools used to access and modify them properly can be tricky. For instance, data needs to be accessible as quickly as possible - you want calling a specific contact via the address book to be at least slightly faster than remembering an arbitrary number and dialing it. Many algorithms have been invented, analyzed, and updated in order to take advantage of modern technologies.
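
As a toy illustration of why the underlying structure matters, here is a Python sketch: a hash table (a dict) lets an address book find a contact by name without scanning every entry. The names and numbers are made up.

```python
# A tiny "address book" backed by a hash table: lookup by name is effectively
# constant time, instead of a scan over every stored contact.
contacts = {
    "Alice": "555-0100",
    "Bob": "555-0199",
}

def dial(name):
    number = contacts.get(name)      # hash lookup, no linear search
    if number is None:
        raise KeyError(f"no contact named {name!r}")
    print(f"calling {name} at {number}...")

dial("Alice")
```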

As such, data structures and associated algorithms are essential to supporting the kinds of services that are vital to modern society. There are a plethora of different uses for these tools.  




Structuring Depth-First Search Algorithms in Haskell (1995)