What I’ve been working on the last ten weeks

Ok, so this is going to be really boring for almost everyone… but I am adding it just as a bit of a diary for myself…


Tiny bits of information, 0’s and 1’s, coursing through the veins of a mass amalgamation of wires and routers and computers and pupils and into brains. The world of information technology is indeed a marvelous place to become lost and wander. And yet, what lurks behind the monitors and CPU chassis, beyond the insulated blue covering of the Cat 5e cable, is the world of mathematics.

In the first week of our discrete math course this spring quarter we began discussing algorithmic efficiency. The goal was to answer the question of what makes one algorithm more efficient than another. To answer this question we studied various ways to compare the complexity and number of steps necessary to complete a computation.
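As a rough illustration of the kind of comparison we did (a toy sketch of my own, not one of the actual assignments), here is a bit of C# that counts how many comparisons a linear search needs versus a binary search over the same sorted array:

```csharp
using System;

class EfficiencyDemo
{
    // Count comparisons made by a simple linear search.
    static int LinearSearchSteps(int[] a, int target)
    {
        int steps = 0;
        for (int i = 0; i < a.Length; i++)
        {
            steps++;
            if (a[i] == target) break;
        }
        return steps;
    }

    // Count comparisons made by a binary search on a sorted array.
    static int BinarySearchSteps(int[] a, int target)
    {
        int steps = 0, lo = 0, hi = a.Length - 1;
        while (lo <= hi)
        {
            steps++;
            int mid = (lo + hi) / 2;
            if (a[mid] == target) break;
            if (a[mid] < target) lo = mid + 1; else hi = mid - 1;
        }
        return steps;
    }

    static void Main()
    {
        int[] sorted = new int[1000];
        for (int i = 0; i < sorted.Length; i++) sorted[i] = i;

        // Searching for the last element shows the gap between O(n) and O(log n).
        Console.WriteLine($"Linear: {LinearSearchSteps(sorted, 999)} steps");
        Console.WriteLine($"Binary: {BinarySearchSteps(sorted, 999)} steps");
    }
}
```

Same question, same answer, but roughly 1,000 steps versus about 10. That difference is what the whole efficiency discussion boils down to.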

We found that even in today’s world, where memory is measured in gigabytes and tiny nanometer-scale processors execute billions of instructions per second, these operations still take time and money; and while computers grow faster, smaller, and more powerful, the things we try to do with them become more complex and intriguing, requiring even today’s chip and software designers to remain cognizant of operational efficiency.

In chapter two we discussed different types of relations, functions, and inductive proofs, laying the foundation for future topics in set theory and for proving mathematical statements even when dealing with possible infinities. For example, how do we know that n² is always less than 2ⁿ (for every n ≥ 5) when we don’t have the computational power or lifespan to check every possible n? Again, knowing that computer processing is still limited to finite computations, this concept of dealing with sets in a finite manner, even when the problems fringe on the infinite, becomes very important.
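To make that concrete, here is a small C# check (hypothetical, not from my homework) that verifies n² < 2ⁿ over a finite range; it is the inductive proof, not the loop, that lets us claim the inequality for every n ≥ 5:

```csharp
using System;

class InductionDemo
{
    static void Main()
    {
        // A finite check can only ever cover finitely many cases...
        for (int n = 5; n <= 60; n++)
        {
            double square = (double)n * n;
            double power = Math.Pow(2, n);
            Console.WriteLine($"n = {n}: n^2 = {square}, 2^n = {power}, n^2 < 2^n? {square < power}");
        }
        // ...it is the inductive proof (base case n = 5, then showing the
        // statement for n + 1 follows from the statement for n) that covers
        // every n >= 5 without running anything forever.
    }
}
```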

Moving on to chapters four, five, six and seven (yes, for some reason we skipped chapter three on cryptography, which would have been very exciting!) we began to discuss collections of connected vertices called graphs and networks. Within these chapters we analyzed how to build graphs out of connected vertices, how to analyze graphs for circuits and paths, and how to determine the shortest path from any given point on a connected graph. We discussed special graphs called trees, and examined different types of trees like rooted and binary trees. Once we had analyzed various types of connected graphs and trees, we discussed algorithmic ways to analyze the connectedness of these graphs, and learned ways to match up different connected points on a graph (or tree) in the most optimal ways.
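As a rough sketch of the shortest-path idea (the graph and its weights below are made up for illustration), here is a tiny Dijkstra-style search in C# over an adjacency list:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class ShortestPathDemo
{
    // Dijkstra's algorithm over a small weighted graph stored as an adjacency list.
    static Dictionary<string, int> Dijkstra(
        Dictionary<string, List<(string To, int Weight)>> graph, string start)
    {
        var dist = graph.Keys.ToDictionary(v => v, v => int.MaxValue);
        dist[start] = 0;
        var unvisited = new HashSet<string>(graph.Keys);

        while (unvisited.Count > 0)
        {
            // Visit the unvisited vertex with the smallest known distance.
            string u = unvisited.OrderBy(v => dist[v]).First();
            unvisited.Remove(u);
            if (dist[u] == int.MaxValue) break;

            // Relax each outgoing edge.
            foreach (var (to, weight) in graph[u])
            {
                int candidate = dist[u] + weight;
                if (candidate < dist[to]) dist[to] = candidate;
            }
        }
        return dist;
    }

    static void Main()
    {
        // A small made-up graph: each entry lists (neighbor, edge weight).
        var graph = new Dictionary<string, List<(string, int)>>
        {
            ["A"] = new() { ("B", 4), ("C", 1) },
            ["B"] = new() { ("D", 1) },
            ["C"] = new() { ("B", 2), ("D", 5) },
            ["D"] = new() { }
        };

        foreach (var (vertex, d) in Dijkstra(graph, "A"))
            Console.WriteLine($"Shortest distance A -> {vertex}: {d}");
    }
}
```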

Returning again to the need to remain efficient and small, all of these concepts surrounding graphs, matching, and efficient paths between connected points are very important within the field of information technology, and in the world at large. These techniques can be used for many things: finding the most efficient way to get water to masses of people, finding the quickest, cheapest, and most efficient route from point A to point B, prioritizing the delivery and speed of data packets across a communication network, and the list goes on and on.

Chapter eight continued the themes of set theory and matching. It expanded on the fundamentals of combinatorics and permutations, providing an understanding of how mathematical formulas can determine the number of ways values within a set can be matched and ordered.
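For example (my own toy sketch, not the book’s code), the permutation count P(n, r) and the combination count C(n, r) can be computed directly in C#:

```csharp
using System;
using System.Numerics;

class CountingDemo
{
    // P(n, r) = n! / (n - r)!  -- ordered selections of r items from n.
    static BigInteger Permutations(int n, int r)
    {
        BigInteger result = 1;
        for (int i = n - r + 1; i <= n; i++) result *= i;
        return result;
    }

    // C(n, r) = n! / (r! (n - r)!)  -- unordered selections of r items from n.
    static BigInteger Combinations(int n, int r)
    {
        BigInteger result = 1;
        // Multiplicative form keeps every intermediate value an exact integer.
        for (int i = 1; i <= r; i++) result = result * (n - r + i) / i;
        return result;
    }

    static void Main()
    {
        Console.WriteLine($"P(10, 3) = {Permutations(10, 3)}");   // 720 ordered triples
        Console.WriteLine($"C(10, 3) = {Combinations(10, 3)}");   // 120 unordered triples
        Console.WriteLine($"C(52, 5) = {Combinations(52, 5)}");   // 2,598,960 five-card hands
    }
}
```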

Chapter eight further led into chapter nine’s discussion of iteration, where functions are defined recursively to compute cumulative values such as compound interest. Iterations such as the Fibonacci recurrence were discussed, and we examined first-order linear difference equations and second-order homogeneous linear difference equations with constant coefficients. The purpose of this discussion was to once again understand how algorithms with very large, or potentially unbounded, inputs and outputs can be executed in a finite number of steps with the fewest functional operations.
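A minimal sketch of both ideas (the numbers are purely illustrative): compound interest as the first-order recurrence balance(n) = (1 + rate) · balance(n − 1), and Fibonacci as the second-order recurrence F(n) = F(n − 1) + F(n − 2), each evaluated iteratively rather than by naive recursion:

```csharp
using System;

class RecurrenceDemo
{
    // Compound interest as a first-order linear recurrence:
    // balance(n) = (1 + rate) * balance(n - 1)
    static double CompoundInterest(double principal, double rate, int periods)
    {
        double balance = principal;
        for (int i = 0; i < periods; i++) balance *= (1 + rate);
        return balance;
    }

    // Fibonacci as a second-order recurrence: F(n) = F(n-1) + F(n-2),
    // computed in O(n) steps instead of the O(2^n) calls naive recursion makes.
    static long Fibonacci(int n)
    {
        long prev = 0, curr = 1;
        for (int i = 0; i < n; i++)
        {
            long next = prev + curr;
            prev = curr;
            curr = next;
        }
        return prev;
    }

    static void Main()
    {
        // $1,000 at 5% per period for 10 periods (illustrative numbers only).
        Console.WriteLine($"Balance: {CompoundInterest(1000, 0.05, 10):F2}");
        Console.WriteLine($"F(10) = {Fibonacci(10)}");   // 55
        Console.WriteLine($"F(50) = {Fibonacci(50)}");   // 12586269025
    }
}
```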

And then we came to the final chapter: chapter ten. During chapter ten we began to discuss what interests me the most in the whole conversation of discrete mathematics: finite state machines. We examined logic gates and integrated circuitry, bringing the discussion of algorithmic efficiency down from the ethereal world of intangible algorithms to building real-world circuits at the hardware level.
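As a tiny illustration (a machine of my own invention, not one from the text), here is a finite state machine in C# that accepts binary strings containing an even number of 1s, which is the same parity computation an XOR-based circuit performs in hardware:

```csharp
using System;

class FiniteStateMachineDemo
{
    // A two-state machine over the alphabet {0, 1}:
    // Even = an even number of 1s seen so far, Odd = an odd number.
    enum State { Even, Odd }

    static bool AcceptsEvenOnes(string input)
    {
        var state = State.Even;          // start state (also the accepting state)
        foreach (char symbol in input)
        {
            // The transition function: a '1' flips the state, a '0' keeps it.
            if (symbol == '1')
                state = (state == State.Even) ? State.Odd : State.Even;
        }
        return state == State.Even;
    }

    static void Main()
    {
        foreach (string s in new[] { "0110", "1011", "", "1001" })
            Console.WriteLine($"\"{s}\" accepted? {AcceptsEvenOnes(s)}");
    }
}
```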

During this ten-week course, I chose to produce all of my weekly assignments in bits and bytes, utilizing Microsoft C# (a high-level managed language) to produce input/output sequences that human eyes can read and interpret. While some weeks were more challenging than others, each week presented some new twist in figuring out how to represent a human-defined problem in a way that circuits and numbers could operate on while still producing meaningful output.

While none of my assignments required writing code efficient and stable enough to sustain life (like a ventilator or respirator apparatus), it was still often challenging to produce the optimal output in the minimal number of steps, especially when multi-dimensional structures (like graphs and trees) had to be represented in flat, linear forms like bit streams and bytes. Additionally, there were challenges to overcome when faced with the limits of numerical representation on a 64-bit platform.
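One concrete example of that limit (with made-up numbers, not one of the graded assignments): factorial values blow past a 64-bit integer surprisingly quickly, which is where something like System.Numerics.BigInteger becomes useful.

```csharp
using System;
using System.Numerics;

class OverflowDemo
{
    static void Main()
    {
        // 20! still fits in a 64-bit long, but 21! does not;
        // the 'checked' block makes the overflow visible instead of silent.
        try
        {
            long f = 1;
            checked
            {
                for (int i = 1; i <= 21; i++) f *= i;
            }
            Console.WriteLine($"21! = {f}");
        }
        catch (OverflowException)
        {
            Console.WriteLine("21! overflows a 64-bit long.");
        }

        // BigInteger trades some speed for arbitrary precision.
        BigInteger big = 1;
        for (int i = 1; i <= 21; i++) big *= i;
        Console.WriteLine($"21! = {big}");
    }
}
```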

In closing, I have compiled a final project that presents, in a single user interface, all of the functions and routines I created throughout the ten-week course. This course has continued to broaden my understanding of the fundamental concepts behind computational theory and technological efficiency.

Project Files
