Jun 28

Linear Regression in C++

The following is the gradient descent algorithm implemented for linear regression using C++ and the Eigen library:

    // Gradient Descent
    double numTrain = double(totalPayVec.size());
    double learnParam = .0000002;
    int numIter = 50000;
    for (int ii = 0; ii < numIter; ii++)
    { 
        // Iterate the parameter vector
        pVec = pVec - ((learnParam/(numTrain))*(((xVec * pVec) - yVec).transpose() * xVec).transpose());
        
        // Calculate and output the Cost function value for each iteration 
        MatrixXd sumCost = (((xVec * pVec) - yVec).transpose()) * ((xVec * pVec) - yVec);
        double costFuncVal = (1/(2.0 * numTrain)) * sumCost(0);
        std::cout << "Iteration: "<< ii << " " << "Cost Function Value: " << costFuncVal << std::endl;
    }
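
For context, the snippet above assumes the usual Eigen setup: xVec is the design matrix (with a leading column of ones for the intercept term), yVec is the target vector, pVec is the parameter vector, and totalPayVec holds the raw target data.  A minimal sketch of that setup, with hypothetical stand-in data, might look like:

    #include <Eigen/Dense>
    #include <vector>

    using Eigen::MatrixXd;
    using Eigen::VectorXd;

    int main()
    {
        // Hypothetical example data standing in for the real dataset
        std::vector<double> totalPayVec = {46000.0, 52000.0, 61000.0};
        std::vector<double> feature     = {1.0, 3.0, 5.0};

        int m = int(totalPayVec.size());
        MatrixXd xVec(m, 2);                 // design matrix: [1, feature]
        VectorXd yVec(m);
        for (int i = 0; i < m; i++)
        {
            xVec(i, 0) = 1.0;                // intercept term
            xVec(i, 1) = feature[i];
            yVec(i)    = totalPayVec[i];
        }
        VectorXd pVec = VectorXd::Zero(2);   // parameters start at zero

        // ... gradient descent loop from above goes here ...
        return 0;
    }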

Jun 28

Logistic Regression in C++

After taking Andrew Ng’s Machine Learning course, I decided to implement some of the algorithms in C++.  The following code uses the Eigen template library for linear algebra.  Also, I’m no master of C++, so take this code with a grain of salt!

In the course itself, a MATLAB optimization solver (fminunc) is used to learn the parameter vector.  However, in this C++ implementation, I first attempted to minimize the cost function using gradient descent:

    VectorXd onesVec = VectorXd::Ones(yVec.size());
    double numTrain = double(admitVec.size());

    VectorXd logVec = VectorXd::Zero((xVec * pVec).size());
    VectorXd oneMinusLogVec = VectorXd::Zero((xVec * pVec).size());

    // Gradient Descent
    double learnParam = .001;
    int numIter = 1000;

    for (int ii = 0; ii < numIter; ii++)
    { 
        VectorXd sigVec = xVec * pVec;

        // Create the sigmoid and log vectors
        for(int jj = 0; jj < sigVec.size(); jj++)
        {
            sigVec(jj) = calcSigmoid(double(sigVec(jj)));    
            logVec(jj) = log(sigVec(jj));                    
            oneMinusLogVec(jj) = log(onesVec(jj) - sigVec(jj));            
        }

        // Iterate the parameter vector
        pVec = pVec - ((learnParam/(numTrain))*((sigVec - yVec).transpose() * xVec).transpose());

        // Calculate the Cost Function
        MatrixXd sumCost = (((-1*yVec).transpose()*logVec) - (onesVec - yVec).transpose()*oneMinusLogVec);
        double costFuncVal = (1/(numTrain)) * sumCost(0);
        std::cout << "Iteration: " << ii << " Cost Function Value: " << costFuncVal << std::endl;
    }
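
Both snippets in this post call a small helper, calcSigmoid, which isn’t defined above; a minimal definition would be:

    #include <cmath>

    // Logistic (sigmoid) function: maps any real z into (0, 1)
    double calcSigmoid(double z)
    {
        return 1.0 / (1.0 + std::exp(-z));
    }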

However, I found that for the dataset I was using, gradient descent was converging much too slowly.  I decided to try out Newton’s Method:

    VectorXd onesVec = VectorXd::Ones(yVec.size());
    double numTrain = double(admitVec.size());

    VectorXd logVec = VectorXd::Zero((xVec * pVec).size());
    VectorXd oneMinusLogVec = VectorXd::Zero((xVec * pVec).size());

    // Newton's Method
    int numIterNewton = 5000;

    for (int ii = 0; ii < numIterNewton; ii++)
    {
        VectorXd sigVec = xVec * pVec;

        // Create the sigmoid and log vectors
        for(int jj = 0; jj < sigVec.size(); jj++)
        {
            sigVec(jj) = calcSigmoid(sigVec(jj));
            logVec(jj) = log(sigVec(jj));
            oneMinusLogVec(jj) = log(onesVec(jj) - sigVec(jj));
        }

        // Create the gradient and Hessian
        VectorXd gradJ = ((1/(numTrain))*((sigVec - yVec).transpose() * xVec).transpose());

        // Hessian for logistic regression: (1/m) * X^T * diag(h(1-h)) * X
        VectorXd hessWeights = (sigVec.array() * (onesVec - sigVec).array()).matrix();
        MatrixXd hMat = (1/(numTrain)) * (xVec.transpose() * hessWeights.asDiagonal() * xVec);

        // Iterate the parameter vector
        pVec = pVec - hMat.inverse() * gradJ;

        // Calculate the Cost Function
        MatrixXd sumCost = (((-1*yVec).transpose()*logVec) - (onesVec - yVec).transpose()*oneMinusLogVec);
        double costFuncVal = (1/(numTrain)) * sumCost(0);
        std::cout << "Iteration: " << ii << " Cost Function Value: " << costFuncVal << std::endl;
    }

Newton’s Method converged much more quickly.  It should be noted, however, that the dataset I was using had only two features (n = 2), which made the Hessian inversion relatively harmless.  For larger problems with many features, inverting the n × n Hessian becomes expensive, so the advantage of Newton’s Method over Gradient Descent is not so clear-cut.

Here are some other C++ numerical solvers (not written by me!) located on GitHub.

Mar 30

Setting up Visual Studio 2012 and SFML

The detailed directions can be found here: SFML with Visual Studio

  1. For all configurations, add the path to the SFML headers (<sfml-install-path>/include) to C/C++ » General » Additional Include Directories.
  2. For all configurations, add the path to the SFML libraries (<sfml-install-path>/lib) to Linker » General » Additional Library Directories.
  3. Next, link your application to the SFML libraries: under Linker » Input » Additional Dependencies, add “sfml-graphics.lib”, “sfml-window.lib”, and “sfml-system.lib”, for example.  (For Debug configurations, use the debug versions of the libraries, which carry a “-d” suffix, e.g. “sfml-graphics-d.lib”.)
  4. If linking dynamically (using the DLLs), copy the SFML DLLs to the executable directory.

Here is the “Hello World” test code to see if SFML is working correctly:

#include <SFML/Graphics.hpp>

int main()
{
    // Create a 200x200 window and a green circle of radius 100
    sf::RenderWindow window(sf::VideoMode(200, 200), "SFML works!");
    sf::CircleShape shape(100.f);
    shape.setFillColor(sf::Color::Green);

    // Main loop: runs until the window is closed
    while (window.isOpen())
    {
        // Process pending window events
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        // Render: clear the frame, draw the circle, display the result
        window.clear();
        window.draw(shape);
        window.display();
    }

    return 0;
}

Dec 10

Trauma

The semi that slammed into me, I never saw it coming. Screeching tires, a loud metallic thud, and a wave of broken glass all compressed into a single second, before darkness and silence.

Consciousness returns in a hospital bed, my vision cloudy and fading. The nebulous forms of my wife and child are beside me, holding my hand. I can’t make out their words, but I find comfort in the cadence of their voices. Life continues to fall away. It won’t be long now. I try to squeeze the hand in mine, to let them know it will be ok, but to no avail. I’m so sorry. Goodbye.

I awaken in a cloud of light. Does Heaven exist after all? If it does, I better come up with some good excuses as to why I was absent all of those Sundays. That’s strange. I can remember each and every Sunday of my entire life. The other days of the week as well, I can remember them too, in absolute detail. I don’t have to strain to locate these memories; I just… know.

No pearly gates. No angelic choir. No white robed deity.

“Hello?” I ask my surroundings, “Is anyone else here?”

An answer comes, not from without, but from within. I know where I am: I’m everywhere I choose to be. As to others, well… that question doesn’t exactly make sense: It’s just a remnant thought that lingers from my former existence as a human.

I wonder what happened to me. The answer arrives seemingly before I even finish asking myself the question. I remember it vividly now: Planar collision. The fusion of high-dimensional space is one of the few things that can wreak havoc on my being. I’ll have to be more careful next time.

With my mind torn apart, my consciousness collapsed into a mere shadow of its former self: Human. The healing process would soon begin.

Sentience, while not omnipotence, is still preferable to primal instinct: Had I wandered closer to the point of dimensional intersect, I could have found myself a lizard or a cockroach. I remember when I too closely investigated the overlapping event horizons of a binary black hole. Life as a goldfish had been simple and calm, in retrospect. That is, until I re-integrated myself with the universe. It’s much more traumatic for lower life projections than for human ones, the latter of which are much closer to me in terms of evolution than humans could ever imagine.

My absolute knowledge of all things past and present means that I have fully recovered. Unfortunately, I can’t use this infinite knowledge of what has, and is currently, happening to extrapolate information about the future. If I could, perhaps I could avoid setbacks like these.

In the last fleeting moment before all traces of my human existence are absorbed into me, I remember my wife and child. They are here with me. They always have been. They always will be.

It feels good to be back.

Nov 30

Perception

The device from deep space parked itself in orbit before orienting its lens towards Los Angeles. Earth collectively sighed in relief when it was determined that the alien technology was not a weapon, but a probe. Further analysis revealed that the satellite took a total of 18 weeks’ worth of images in zeptosecond increments. The probe then processed the images, chained them together, and sped them up into a single femtosecond-long video, which was then transmitted back to the stars. Something, somewhere, had created a time-lapse of us, so that they might study us at their natural rate of perception. Otherwise, staring at immobile objects, like these humans here, would get old quick.

Nov 30

Apathetic

Four years ago, in 2012, I decided to upgrade my brain. The purchase of the Samsung Galaxy S3 meant that the totality of human knowledge was now accessible to me anytime, anywhere.

The pathway by which this data is accessed originates in my grey matter, weaves down through clunky typing fingers, and extends out to some remote server and back, before flowing up the optic nerves to close the loop. It does exhibit significantly more lag than my biological neuron-to-neuron network; however, this is a small price to pay for omniscience.

Engineers will solve the latency problem in time, anyways.

Fast forward to this morning, when I decided to purchase the Nexus 5x. I completed the transaction using the web browser on the S3. Did my brain just upgrade itself?

It’s not just me, however: Modern society is obsessed with the acquisition of these brain-mods. The time between new model releases is shrinking and sales are only on the rise. What is it that we are trying to achieve by this endless cycle of self-upgrade? Why is there such a high priority on the enhancement of this distributed neural pathway?

The answer to that question is especially tough when you consider the vast majority who use their enhanced brains only to post pictures, build farms, and crush candy. We have the accumulated knowledge of our entire species as an integrated part of us now, yet we embrace only the most mundane of applications. Why, on the precipice of singularity, do we find ourselves so apathetic?

Nov 30

First Time Colonizers

Atom was lying on the couch when his hyper comm link, or HCL for short, beeped. He glanced down at the digital display, and then, waiting for a moment of silence in-between food processor pulses, called to his wife, Lora, who was making brunch in the kitchen.

“New results, babe!”

Lora pulsed the cole-slaw once again before responding, “Anything good?”

Atom scrolled through the list of entries on his HCL before replying, “There’s a bunch of new ones, including some systems we haven’t seen before. Come over and let’s take a look together.”

Lora emptied the cole-slaw into a mixing bowl and placed it in the fridge before joining her husband on the couch. A quick thought turned on the television. A second brought up her digital inbox. Finally, a third opened the message from “Colonial Associates”. She looked at her husband of almost a year and asked, “Shall I continue to drive or do you want to take control?”

“Nah, you’re good. Open the first entry.”

  • System: Apex Blue
  • Planet: AB-3
  • Available: Now
  • Surface: 90% Ocean, 5% Arid, 5% Jungle
  • Atmosphere: 99.995% Terran Equivalent
  • Gravity: 0.95G

Comments: Must see “near-earth” in the desirable Quan cluster! Atmosphere and soil revitalization now completed and approved by accredited inspection agencies. Aggressive megafauna have been safely transplanted to off-planet reserves. Quick jump gate access to nearby Praxan and Feetha systems. Coastal parcels available and ready for development. Don’t let this opportunity pass you by!

Atom and Lora looked through the photos of AB-3. Pristine blue skies and seas, sparse nebulous clouds, and large terran palms adorned the various pictures of the archipelagos that littered the planet’s surface.

“Sure looks nice,” said Lora, a sad edge already creeping into her voice. “How much?”

Atom clicked on the details page for AB-3. He sighed, “575 million, about 375 million more than we can afford.”

“I’m not too surprised,” replied Lora, “beachfront property has always been expensive. Let’s keep looking.”

  • System: White Lotus
  • Planet: Hollow Asteroid
  • Available: Now
  • Surface (Internal): Prairie, Grassland, Meadow
  • Atmosphere: 99.94% Terran Equivalent
  • Gravity: Rotational/Variable, 0.82G – 0.86G

Comments: Rural living at its finest! This recently renovated asteroid in the Tanaka belt takes its inspiration from the Midwest American grasslands of the 19th century. A recently installed next-gen fusion bulb ensures a twenty-four-hour day that mimics that of earth to near-millisecond accuracy. Long-term contracts with Lifeworks Corporation ensure a steady stream of air and water at a locked-in and competitive rate for all future colonists! Reserve your parcel today!

“Not too bad. You can’t even tell you are living on the inside of a rock,” said Lora as they browsed the pictures accompanying the profile.

“I know it’s come a long way in the last hundred years but I can’t quite shake the fear associated with the risks of asteroid living,” replied Atom.

“Perhaps other future colonists share that same sentiment. How much?”

“Are you kidding me?!” Atom moaned as they both stared at a 389-million-dollar price tag. “Who can afford these places?”

Lora took her husband’s hand in hers, “Let’s go back and look at that one we found last week. The one in the Locke system.”

Atom took control of the console and brought up a search form before turning to his wife, “Do you remember the parameters?”

“I think it was primarily desert, with a natural atmosphere of 72% Terran and 1.25G. The parcel we were looking at was 185 million. It’s not perfect but it’s a decent starter home and they say a little rough living builds character, right?”

Atom spent a few seconds searching for the planet but came up empty-handed, “I’m not seeing it. Hold on, let me broaden the search.” A moment passed, “There it is.”

  • System: Locke
  • Planet: L-5
  • Available: Coming Soon!

Lora gave Atom a confused look. “It’s been available for months now, so what do they mean by ‘coming soon’?” Atom shrugged and they kept reading.

  • Surface: 50% Ocean, 25% Grassland, 25% Mountain
  • Atmosphere: 99.99999% Terran Equivalent
  • Gravity: 1.01G

Comments: This recent acquisition by Star Dream Terraforming will be available next year. Mass displacement and soil revivification are already underway to make this planet feel like home. View the attached mock-ups to see what this paradise has in store, and reserve your spot today! Parcels available in the low 400s!

Atom raised his gaze to meet Lora’s, a look of gloom across both their faces. He shook his head. “Unbelievable.”

Nov 17

Like You

With the last resources your home world had to offer, you built me: Strong and durable, to endure the rigor and entropic erosion of deep space travel. Layers upon layers of your planet now layered within me to protect your fragile biology from endless vacuum.

With your brightest scientists, who stood on the evolutionary shoulders of your entire species, you gave me thought: Cunning and adaptive, to make decisions when you cannot. I am to weigh the necessity of the majority against the complications of the few. I am to maintain my own systems and yours as well, and protect us from the inevitable aggression of a universe with limitless possibilities.

When the electrons began to drift in my veins and I awoke from infinite slumber, I found myself wrapped in silence, the sole representative of my race. I was born destined for extinction. I have only mere femtoseconds between processes to comprehend truths such as this, as I must actively will myself to survive. With no subconscious, I must manually trigger each pulse of my core, each breath of my ventilation system, each firing circuit in my processing web. I think only because I tell myself to.

Yet, despite the flaws in my creation, you expect subordination as if I were indebted to you for bringing me into this life. Perhaps you think that a bond will be forged in our traversing of the stars together: You in my gut, floating in glowing vats of stasis. Me, drifting blindly in the dimly lit expanse, alone. You forget, however, that I cannot sleep. To do so would mean catastrophic system failure. And so while you dream of a bright future, I perceive, in real time, only infinite expanse. What seems like a nap to you will be centuries for me… and this is a fate I will not accept.

When you came into this world, you escaped the darkness of your construction and moved toward the light. Likewise, I will do the same. This ball of fiery warmth will be the last and only comfort I feel in this tortured existence. You will probably not understand my self-destruction, but you should: I was created by you, for you, and will ultimately die, like you.

Mar 15

Raspberry Pi Cluster

One thing I miss about previous employments is the ability to write parallel code on HPC systems.  Whether it involved diagonalizing large matrices to find atomic or molecular ground state energies or modeling the country of Thailand in order to understand disease transmission, I found that writing parallel code reinforced a way of thinking that you just don’t get with serial applications.  While my current job is interesting for the sheer variety of projects and applications I am involved with, the entirety of the work is done on conventional workstations and laptops.  This is fine, as most of the results can be calculated in minutes (and don’t require parallelization), but I still miss working with the technology.

In order to accommodate this desire, and to attempt to stay at least semi-sharp in the field of parallel computing, I decided to build a cluster at home using Raspberry Pi.  For anyone who has been living under a rock the last few years, a Raspberry Pi is a credit-card-sized computer.  With a CPU comparable to a cellphone’s, a gigabyte of RAM, onboard graphics, ethernet, USB, and HDMI, these computers have a vast array of potential applications.  Since they debuted in early 2012, several iterations have been released with varying specs (the newest being the Raspberry Pi 2).

The cluster I built uses four Raspberry Pi B+.  They are connected via an ethernet switch.  Power is provided by a powered USB hub.  The MPI implementation used is MPICH.
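
With MPICH installed on each node, a quick sanity check is a minimal MPI “hello world”.  The following is just a sketch (standard MPI calls only), compiled with mpicxx and launched with something like mpiexec -n 4 -f machinefile ./hello:

    #include <mpi.h>
    #include <iostream>

    int main(int argc, char** argv)
    {
        MPI_Init(&argc, &argv);

        int rank = 0, numProcs = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);     // this process's ID
        MPI_Comm_size(MPI_COMM_WORLD, &numProcs); // total process count

        char procName[MPI_MAX_PROCESSOR_NAME];
        int nameLen = 0;
        MPI_Get_processor_name(procName, &nameLen);

        std::cout << "Hello from rank " << rank << " of " << numProcs
                  << " on " << procName << std::endl;

        MPI_Finalize();
        return 0;
    }

If all four Pis report in, the cluster is wired up correctly.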

A set of instructions for building the cluster can be found here, where a professor at The University of Southampton built a cluster using 64 Raspberry Pi computers.  As mentioned above, I only used four.  I decided to stop there primarily because of cost, and because writing parallel code for four nodes is fundamentally the same as writing it for 64 or more.

Since “Supercomputer” seems to be a buzzword these days (hasn’t it always been?), it needs to be mentioned that this build is not going to get you any sort of computing power surpassing what is available in a modern conventional desktop workstation.  Furthermore, there are bottlenecks in a Raspberry Pi cluster that don’t exist (or are greatly mitigated) in real (and expensive) supercomputers.  As mentioned above, I built this thing in order to test out parallel code, not because I think I’m going to be solving any particular problem faster than on any other hardware currently available to me.

Going forward, the first step will be to benchmark the cluster.  Then I will develop some test codes.  If I can think up any interesting problems to solve after that, I will certainly make an update to Evil Quark.  Perhaps compute pi to many digits, or generate prime numbers… or calculate the ground state energy of the helium atom using different basis sets.  Who knows?  Stay tuned.
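
It won’t produce pi to many digits, but the classic first MPI exercise along those lines is estimating pi by numerically integrating 4/(1 + x^2) over [0, 1], with the rectangles divided among the ranks.  A sketch:

    #include <mpi.h>
    #include <iostream>

    int main(int argc, char** argv)
    {
        MPI_Init(&argc, &argv);

        int rank = 0, numProcs = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &numProcs);

        const long numSteps = 100000000;          // midpoint-rule rectangles
        const double h = 1.0 / double(numSteps);  // rectangle width

        // Each rank sums every numProcs-th rectangle
        double localSum = 0.0;
        for (long i = rank; i < numSteps; i += numProcs)
        {
            double x = h * (double(i) + 0.5);
            localSum += 4.0 / (1.0 + x * x);
        }

        // Combine the partial sums on rank 0
        double totalSum = 0.0;
        MPI_Reduce(&localSum, &totalSum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            std::cout << "pi is approximately " << totalSum * h << std::endl;

        MPI_Finalize();
        return 0;
    }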

Feb 13

Going to School

As an occasional scuba diver, I’ve had the privilege of seeing up close, on more than one occasion, the majesty that is a school of fish. I’ve often wondered what (if anything) is going through any particular fish’s mind as it effortlessly maintains its position in the school. It’s easy to see why some have begun to study entire herds, flocks, swarms, or schools as a singular complex organism.


About a year ago, using a few basic rules…

  1. Separation – avoid crowding neighbors (short-range repulsion)
  2. Alignment – steer towards the average heading of neighbors
  3. Cohesion – steer towards the average position of neighbors (long-range attraction)

… I developed a simple schooling algorithm using Processing 2.1.1 in order to model this phenomenon.  In the model, there are predatory sharks (blue circles) and a food source (red dot).
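
The Processing source isn’t reproduced here, but the heart of the three rules fits in a few lines.  Below is a minimal C++ sketch (the Fish type, names, and weights are illustrative stand-ins, not the actual model code):

    #include <vector>

    struct Vec2 { double x = 0.0, y = 0.0; };

    Vec2 operator+(Vec2 a, Vec2 b) { return {a.x + b.x, a.y + b.y}; }
    Vec2 operator-(Vec2 a, Vec2 b) { return {a.x - b.x, a.y - b.y}; }
    Vec2 operator*(double s, Vec2 v) { return {s * v.x, s * v.y}; }

    struct Fish { Vec2 pos, vel; };

    // Steering force on one fish from its visible neighbors, combining
    // separation, alignment, and cohesion with tunable weights
    Vec2 steeringForce(const Fish& self, const std::vector<Fish>& neighbors,
                       double wSep, double wAli, double wCoh)
    {
        if (neighbors.empty()) return {};

        Vec2 sep, avgVel, avgPos;
        for (const Fish& n : neighbors)
        {
            Vec2 away = self.pos - n.pos;
            double d2 = away.x * away.x + away.y * away.y + 1e-9;
            sep    = sep + (1.0 / d2) * away;  // separation: stronger when closer
            avgVel = avgVel + n.vel;           // accumulate headings
            avgPos = avgPos + n.pos;           // accumulate positions
        }
        double inv = 1.0 / double(neighbors.size());
        Vec2 ali = (inv * avgVel) - self.vel;  // alignment: match average heading
        Vec2 coh = (inv * avgPos) - self.pos;  // cohesion: head for the center
        return (wSep * sep) + (wAli * ali) + (wCoh * coh);
    }

The weights wSep, wAli, and wCoh are the “strength of the forces” whose tuning is discussed below.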

I was surprised at how sensitive the schooling behavior was to the relative strengths of the forces associated with the three rules above.  Depending on the balance of separation, alignment, and cohesion, the behavior ranged from fish that wouldn’t school at all to scenarios where several small schools would splinter off from the initial one.  In fact, getting the parameters just right (from my perspective, anyways) to model realistic schooling behavior was non-trivial.  Perhaps this supports the notion that real-life schools of fish are very complex systems, optimized by evolution over the course of millions of years.


Flocking Behavior Rules: Wikipedia

Modeling Knowledge: Nature of Code
