31 December 2008

Fantasy Football?

This might feel a little Carlin-esque at times, but bear with me.
Rotisserie baseball is an apt and understandable extension of the Great American Pastime. Slow, methodical, numerically based. No matter how many numbers they cram on a TV screen during a football game, no sport is as stat-happy as baseball. Part of the allure of attending a baseball game is filling in your own scorecard as the game progresses, so I've read.
The slow nature of baseball - pitch after pitch of steady rhythm punctuated by the occasional crack of the bat, or roar of the crowd - makes it possible to accurately and completely record the outcome of a game. Even if there's a dramatic rundown for a double play in the top of the 6th inning, you're still going to have plenty of time to record who was out, how many runs made it in, and - if you're really good - how many times the pitcher masticated his gum of choice. You'll definitely have time. Baseball is played in parks because you do so much more than watch the guys on the field when you go.
Thank you, Mr. Carlin.
Then we have "Fantasy Football": you take a fast-moving sport, a series of dramatic moments punctuated by the occasional TV timeout, where so much is going on that if you look away from the field (television) for 5 seconds you'll likely have missed the greatest play in the history of football (yes, yes, there's replay, but you and I both know that's not the same thing), and you boil it down to the numbers.
The closest I ever personally came to playing fantasy football was a weekly game-choosing contest some friends and I participated in for sodas. Because gambling on sports is apparently illegal in Maryland, and we would never do that. Not this little black duck, no sir.
No drama, no tension, no story - the tension of 4th and goal when your team is down by 5 and there are 3 seconds left in the game (and there's a playoff spot riding on this play) is boiled down to whether or not "& Goal" includes enough yards to bring "your"* starting running back over some other guy's RB's yardage for the week. What makes it even better is if the guy you "drafted"* to be "your"* RB is in fact playing for your team's opponent.
Are you kidding me?
Maybe Fantasy Football is for the person without a team, who loves the numbers and the statistics-as-competition, a real rotisserie baseball type who gets withdrawal symptoms after the World Series ends. Fantasy Football is stat-sports methadone.
I love American football (as well as futbol, but that's another essay), and I can understand how some guys are drawn into Fantasy Football because they love football so damned much that they want anything and everything to do with it. You can bet the guy who owns that Steelers room in that NFL merch commercial has a Fantasy Football team. Probably stacked with Steelers, too. Except for that one position (second wide receiver? Do they have that position in Fantasy Football? No - don't answer that, I don't care) that some other guy "drafted" out from under him.
This is another reason why I couldn't do Fantasy Football (other than the implied fact that it's more boring than a North Sea Oil Platform). My love for the game made me consider playing, but I'd want to draft my team, because they are who I root for. It's the same reason why I didn't win as many sodas as I should have when I was picking games way back when: I always pick my team to win. I'd be disloyal if I didn't choose my team to win, even if they're playing a team they haven't beaten since they moved to Baltimore; my logic, such as it is, always being this: Any given Sunday...
I've been thinking as I write, and maybe football is every bit as stat-happy as baseball. Yards per carry, number of TD receptions, yards after catch: these numbers define my Sundays from September through January, but only in the context of the NFL team for whom I root, not some fantastic and unnatural conglomeration of Superstars who only play together in formulas and fiction. Football is as much about the average Joe slugging it out in the trenches, the surprise star of the season (how many Fantasy types drafted Flacco or McClain?), as it is about the flashiest receiver who is best at stashing a Sharpie in his sock. Fantasy Football denies this about our game, while bleeding the weekly drama until all we have left is black and white, maybe in Comic Sans, if you're one of those people.
Honestly, Fantasy Football guys (guys includes all female-type persons), I don't see the point, and tend to toss it into the same mental pile as NASCAR, if slightly higher on the pile. In the interest of not being a complete hypocrite, I won't call for its disappearance from existence, because it does me no harm. I ask you to not try to persuade me otherwise, because I only care enough about it to complain.
Happy New Year!





*I'm verbally winking at you because the fictitious back in question is not yours, and the only thing you drafted is a fairly good sign that you have too much free time.

24 December 2008

Anecdotal Evidence to the Contrary

Headlines like these:

Final holiday push: Empty stores

leave me scratching my head. Where are these empty stores with reduced foot traffic? Why couldn't I find them so I could do my Christmas shopping there?
I finished my shopping this Saturday afternoon. I tried to go counter-clockwise around the mall to get from Sam's to Kohl's, and spent 20 minutes sitting in TRAFFIC. At the VALLEY MALL.
I finally bailed and parked at Macy's and went in.
Sure, I guess the economy's bad when I actually consider purchasing items at Macy's (usually a load of over-priced merchandise trading on a parade and a couple of movies), but ultimately that was happenstance because I saw an empty parking space and took it - outside Macy's.
And sure, I was able to proceed clockwise around the mall and get to Kohl's in short order, but the Mall itself was packed with people. PEOPLE. Buying THINGS.
I always knew Hagerstown was behind the times, but this is ridiculous.

23 December 2008

Creating a Better Linux: Open Source Verification and Validation

I put a lot into this research paper this past semester, and consider it a quantum leap forward in my synthesis of both the contemporary semester's material and the previous Free & Open Source Software (FOSS) research I'd completed in working on my Master's.
I'm actually, as much as applicable, trying to focus my graduate research on FOSS. It's incredibly interesting, and to be honest, a lot of fun to work with, not to mention very capable of getting the job done. The following paper was "A" worthy, according to the professor, but he dinged me to a "B" because I didn't adhere to APA styling as much as I should have. I concede that point. My TAs from undergrad spent a lot of hours beating APA formatting into me; you'd think it would be automatic - but I digress. Here now, pasted from Word (though mostly composed using the FOSS OpenOffice.org Writer) into this little web-logging interface, is my paper for my Software Verification and Validation course:

Running Head: CREATING A BETTER LINUX

Creating a Better Linux:

Open Source Verification and Validation

SWEN 647

November 14, 2008

Robert C. Murray

Abstract

Free and Open Source Software (FOSS) has grown from a 1980s concept and 1990s hobby into a major market force in the present day. There are many FOSS projects that utilize software verification and validation (V&V) in their development processes, but many others either underutilize V&V or fail to employ it at all. All FOSS projects should adopt V&V measures to improve their quality. Revolutionizing software development, FOSS has been the stimulus for many new methods of software engineering. FOSS projects have been used educationally to demonstrate this need for V&V, acting as models for V&V classes at the graduate level. As demonstrated by organizations such as Canonical (makers of the Ubuntu Linux distribution), V&V allows a much higher quality FOSS product to be developed. Applying V&V measures to FOSS products will improve the products created, furthering not only the movement, but the potential for profits to be realized from offering services related to the software itself. The success and profitability of FOSS organizations prove that this will be the case.

Outline

I. Introduction

A. What is Free Open Source Software (FOSS)?

B. FOSS needs more verification & validation (V&V)

II. FOSS revolutionizing software development

III. FOSS and V&V in education

IV. Metrics on how V&V improves projects

V. How V&V is used in FOSS

VI. How increased V&V can improve FOSS dev

VII. Discussion

VIII. Conclusions

The movement for free and open source software (FOSS) arguably began in 1983, when Richard Stallman published his manifesto outlining the GNU is Not Unix (GNU) project. Stallman's thinking was that software should be as “free as air” (Stallman, 1983). In the document he considers the commercial sale of software to be a destructive force, and insists that people should pay only to obtain support for software or for its distribution. According to GNU, in a perfect world everyone has the right to freely create and modify the software they are using without the danger of violating any licensing agreements.

Almost a decade after the GNU project was begun, Linus Torvalds (a 21-year-old from Finland) developed Linux, a FOSS Operating System (OS), as "just a hobby" (Hasan, 2005). Since 1991 Linux has grown from one man's FOSS hobby project into a multi-million dollar industry that comprises dozens of distributions, all free and legal for the downloading (or installing from a friend's disc). Linux is far from the only FOSS project in the world, but it could certainly be considered one of the largest, if not the largest.

While the primary kernel (or central program) of Linux is still managed by Torvalds, dozens of Linux distributions (known colloquially as "flavors") use this one kernel, each managed by a different company, organization, or single hobbyist much like Torvalds once was. Each of these groups applies its own suite of software, everything from drivers and system software to productivity applications and games. Larger groups include the Red Hat corporation and Canonical; the former has an eponymous distribution, and the latter is the company founded by South African businessman Mark Shuttleworth that is responsible for Ubuntu Linux, a distribution that is growing in popularity around the world.

Many other FOSS projects exist to develop software applications that do everything from serving web pages to playing video games, and a good portion of FOSS projects seek to duplicate or supersede the performance of similar closed source software products. Though the large organizations that exist to create Linux and other FOSS do employ some verification and validation (V&V) strategies in their community based development paradigm, if more projects adopted V&V measures it would greatly increase the efficiency and functionality of their software.

Before we examine how a more broadly adopted V&V effort can improve FOSS, it is important for us to truly understand what FOSS means, and how it has changed the nature of software development. By understanding this we can then appreciate why V&V must play a much larger role in many FOSS projects: those that exist, and those that have yet to begin.

FOSS “facilitate[s] competition and open[s] markets as well as innovation to meet new challenges” (Ebert, 2007). When we consider the contemporary software development community, it is not possible to think of it without recognizing and appreciating the contributions made by FOSS. FOSS as both a movement and a market has been a significant and growing component of our community. While in the 1990s the software markets were dominated by proprietary, closed source players like Microsoft, Oracle, and Novell, the opening decade of the twenty-first century has seen the inclusion of the names Red Hat, Mozilla/Firefox, Apache, and Open Office.

As software developers the world over turned a critical eye to the software they used every day to write documents, surf the World Wide Web (the web), even create and maintain the web itself, they realized that the closed source offerings they had been utilizing were either sub-standard, over-priced, both, or sometimes just not what a group of developers thought a piece of software should be. Some developers just wanted to tinker. They came together via the nascent communication medium of the web to collaborate and develop the large body of FOSS that we have at our disposal today. The web is what really made it possible for FOSS to take off, giving developers from Bangalore to Boston a globally accessible and temporally neutral repository for communication and code. With this revolutionary communications medium and a hunger to make a better product, FOSS took off.

A product of this new medium was the creation of several new software development paradigms, like incremental development, an agile approach (Ebert, 2007). Developers used these models on their FOSS projects, and eventually brought them into use for their professional software development; FOSS has likewise contributed to software security. The definitively “open source” nature of FOSS means that when a security vulnerability is found, the fix is fast in coming, because there is no need to wait for the corporate machinations of a software company to identify and release it. This rapid approach to development is sometimes a negative point and would be slowed by the implementation of V&V in some projects, but this is an issue that will be discussed later.

FOSS is responsible for 43% of in-house software development in the United States as of 2007 (Ebert, 2007). It is logical to conclude from the absence of news to the contrary (something that the author would have encountered in his daily technology news reading) that this number is either now the same or, more likely, larger. This growth in presence has forced the closed source companies listed above to step up their game, as it were. Giants like Microsoft continually toy with the idea of open source, throwing the occasional bone to the community, while others like Novell adopt FOSS as a tactic for survival in the marketplace. FOSS has revolutionized software development; and, as one might expect, it has at times become a component of the education process.

Christopher Fuhrman posited that FOSS would be a good source of the design problems software developers encounter in the real world, thus providing students with a robust environment of source material to practice on legally and without cost (thanks again to GNU et al.).

According to Fuhrman (2007), there are three major problems that must be overcome when teaching first year undergraduates software design. First, students must be made to understand that a recurring problem is actually recurring; it is not always obvious to the student that what appears to be a one-off problem is likely something that is going to be encountered repeatedly. Second, one must convince the student that determining the proper solution requires abstractions that will never be actual code, but are later replaced by functioning code. Third is the application of functions in place of the aforementioned abstractions, using the functions from the “problem-ridden design”.

Given that textbook examples are often unconvincing or even contrived, Fuhrman needed another source of code examples to use in teaching software design; he settled upon the target-rich environment of FOSS. As indicated above, FOSS is often hit or miss when it comes to quality, something that a more concerted V&V effort would certainly resolve.

Though diplomatic when he writes:

Open-source software is ubiquitous and has established a generally positive reputation in terms of its quality to compete with proprietary products in the areas of software development, software configuration and change management, office automation, databases, Web browsers, etc. However, we consider it in this article as a source of realistic software artifacts, which, because they have been developed by humans, are certain to have design flaws that can be corrected with design patterns.

Fuhrman is in fact stating that a good portion of FOSS could use more (or at times any) V&V before it is released by developers for downloading and installing.

Fuhrman tested his hypothesis with a semester of graduate students who completed a series of exercises using FOSS code. Ultimately the students compiled a number of potential improvements for the examples, improvements that likely would not have been necessary had the FOSS projects been subject to more rigorous V&V before release; further evidence that more V&V is necessary in the FOSS community.

Though anecdotally demonstrative of the need for more V&V in the FOSS community, Fuhrman's study does not address the quantitative aspects we would need to further support the implementation called for here. Admittedly, such data were not the purpose of his study, but we do require quantitative support for our hypothesis.

This quantitative support is not as readily available as one might expect. As of 2003, general data regarding FOSS Quality Assurance (QA) activities were rare (Zhao, 2003). However, some data are available. One study surveyed FOSS developers via a pair of popular FOSS hosting sites, eliciting 229 usable responses, with the projects themselves sorted into four categories: tiny, having fewer than 1,000 lines of code (LOC); small, with 1,000 to 10,000 LOC; medium, 10,000 to 100,000 LOC; and finally large, greater than 100,000 LOC. It is worth noting that FOSS projects often grow in size as they are developed. This is likely due to a lack of Requirements Management resulting in scope creep.

Akin to the lack of overall Requirements Management is the issue of formal documentation. FOSS developers, regardless of the type or size of the project, largely ignore formal documentation when developing these projects. They instead stick to simple methods like “TODO” lists or general guidelines (Zhao, 2003). This is indicated visually in figure 1. However, even without formal documentation of any kind, FOSS projects spend a significant amount of their development time involved in testing. Figure 2 displays the counter-intuitive fact that projects classified as large spend a significantly shorter time in testing than smaller projects. A reminder that this is all largely without a guided V&V effort. The testing that does occur is of a basic nature, most often simply applying inputs that are meant to simulate the users' behavior. This is the most often used test across project sizes.
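
The informal practice Zhao describes - feeding a program inputs that mimic what a user would type - is only one step away from a repeatable, automated check. A minimal sketch of that step is below; the function under test and all names are hypothetical, standing in for any small FOSS utility:

```python
# Hypothetical sketch: turning "inputs that simulate the user's behavior"
# into a repeatable automated suite instead of a one-off manual run.

def normalize_path(path):
    """Collapse duplicate slashes and strip a trailing slash.
    A stand-in for a small FOSS utility under test."""
    while "//" in path:
        path = path.replace("//", "/")
    if len(path) > 1 and path.endswith("/"):
        path = path[:-1]
    return path

# Each case replays something a user might plausibly type.
SIMULATED_USER_INPUTS = [
    ("/usr//local///bin", "/usr/local/bin"),  # sloppy copy-paste
    ("/home/user/", "/home/user"),            # habitual trailing slash
    ("/", "/"),                               # degenerate but legal input
]

def run_suite():
    """Run every recorded case; raise AssertionError on any regression."""
    for given, expected in SIMULATED_USER_INPUTS:
        actual = normalize_path(given)
        assert actual == expected, f"{given!r}: got {actual!r}, expected {expected!r}"
    return len(SIMULATED_USER_INPUTS)

print(f"{run_suite()} simulated-user cases passed")
```

The point of the sketch is that once the ad-hoc inputs are written down as data, the same testing a lone developer does by hand becomes a regression suite any contributor can rerun before submitting a patch.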

Projects that invest almost nothing in V&V, instead conducting only basic testing, are also those most likely to pass along their bugs to the user. Given the community nature of FOSS, often the “user” is a developer himself and therefore expects to function as a tester or debugger for the person or group developing the project. Some consider this their way of helping out the community. For those projects that spend a short amount of time in testing, the user is certain to find more bugs than if the project had been more thoroughly tested. Figure 3 bears out how more bugs are found when software is tested less. We have already seen that the FOSS projects that are tested the least are in fact the largest projects, and therefore the most likely to 'break out' and gain popularity among a more generalized population: average computer users who do not necessarily consider finding bugs a way of helping out the community; they would rather their software work straight away. The FOSS news websites contain monthly – if not weekly – stories about how Linux or Open Office (a FOSS alternative to Microsoft Office) or some other piece of FOSS is almost ready for “the desktop”, meaning office workers and soccer moms, not just computer nerds and developers. If the facts as presented in figures 2 and 3 persist, then these projects will not ever move beyond “almost” ready for the desktop, and FOSS will remain buried in the server room, or the occasional forward-thinking office. The good news is that V&V does exist in some FOSS projects, especially in Linux distributions.

Historically speaking, the Linux kernel was not subject to a disciplined testing regime prior to releasing updates (Thomas, 2003). As with many FOSS projects, a community of developers contributed new features and patches for broken or out-dated features, but testing was ad hoc at best. Smaller FOSS projects (not necessarily Linux, but other applications) created and maintained by only a single developer are often never subjected to any testing beyond what the creator sees fit to complete. With Linux, at least, discipline regarding the testing regime has been growing since approximately 2001.

The largest portion of the verification piece of V&V in FOSS consists of code reviews, conducted by members of a mailing list who receive notice that new code has been submitted for their approval. The web and the rest of the Internet – mailing lists, Internet Relay Chat (IRC) channels, etc. – are used not only to disseminate the need for a code review, but, as described above, by geographically disparate developers to suggest new features and even conduct design reviews before commencing coding.

On the validation side, several sources have indicated the GNU C compiler project as unique among FOSS projects in that it adheres to a stricter process of validation for all changes to the code. With Linux, by contrast, there has never been an existing requirements document to validate against, so the community comes to an agreement on how a new feature will behave before work on that feature begins. This holds true for all new features that are not subject to pre-existing standards documents like POSIX or IETF RFCs (Thomas, 2003).
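
The consensus-first approach Thomas (2003) describes can be made concrete: once the community agrees on how a feature will behave, that agreement can itself be captured as executable validation cases the implementation must satisfy. A hypothetical sketch (the feature, function, and cases are invented for illustration):

```python
# Hypothetical sketch: community-agreed behavior for a new "version
# string comparison" feature, captured as executable validation cases
# instead of a prose requirements document.

def compare_versions(a, b):
    """Return -1, 0, or 1 comparing dotted version strings numerically.
    A stand-in implementation for the feature under validation."""
    pa = [int(x) for x in a.split(".")]
    pb = [int(x) for x in b.split(".")]
    # Pad the shorter list so "1.2" compares equal to "1.2.0".
    length = max(len(pa), len(pb))
    pa += [0] * (length - len(pa))
    pb += [0] * (length - len(pb))
    return (pa > pb) - (pa < pb)

# The agreed behavior, written down *before* implementation begins.
AGREED_BEHAVIOR = [
    ("1.0", "1.0", 0),      # identical versions are equal
    ("1.2", "1.2.0", 0),    # missing components count as zero
    ("1.10", "1.9", 1),     # components compare numerically, not textually
    ("2.0", "10.0", -1),    # likewise for the leading component
]

def validate():
    """Return the (a, b) pairs where the implementation defies consensus."""
    return [(a, b) for a, b, want in AGREED_BEHAVIOR
            if compare_versions(a, b) != want]

print("validation failures:", validate())
```

Because the agreed cases exist before the code does, any contributor's patch can be validated against the same consensus the mailing-list discussion produced, which is exactly the role a requirements document plays in a traditional V&V process.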

Linux distributions often enjoy a high level of quality in spite of a seemingly laissez-faire attitude towards traditional V&V. Instead, as described above, the community steps in either to develop real-time consensus on new feature behavior or to provide sufficient testing through sheer numbers. Some Linux distributions like Ubuntu maintain their own QA teams; Canonical has teams dedicated to bug reporting, testing, V&V, and QA coordination (Canonical, 2008). It bears repeating that this level of V&V is not the norm. With a community of developers and testers supplanting a traditional V&V team, many larger FOSS projects still manage to deliver quality software (Aberdour, 2007). Still, a good number of medium, small, and even large projects suffer from this lack of V&V.

“Given enough eyeballs, all bugs are shallow,” says Eric Raymond.

Raymond's quote has appeared in several of the source documents researched for this paper, and indeed what he calls “Linus's Law” (a reference to Linus Torvalds) aptly sums up the general philosophy of FOSS towards software testing and, by extension, V&V. The movement itself grew organically and was not planned by any one company or organization, though structured organizations like Canonical and Red Hat do exist within the community and employ V&V techniques. These are primarily found in Linux distributions, as the OS itself has been viewed as a revenue generator and thus worthy of a legitimate V&V effort. The payoff of V&V is apparent: Red Hat is a popular server OS, and Ubuntu can be found as an alternative option for computer buyers who don't want Microsoft's Windows on their new PCs, with the popularity of both traceable to the level of quality their V&V efforts provide.

This organic growth has also been highly decentralized, meaning that the FOSS community has lacked a firm guiding hand; certainly there have been luminaries and great contributors like Stallman and Torvalds, but among the pantheon of FOSS, there is no one pointed to as the great Project Manager, he who has made sure the community embraced best practices and did the appropriate testing before releasing. There are those in the community who would argue that this is very much the point, that FOSS isn't about V&V or best practices. A Venn diagram would likely illustrate a significant overlap between these people and the people who desperately want Linux and other FOSS to be adopted by everyone.

Whatever the reasoning, the reality is that the ad hoc or organic nature of FOSS development means that scope creep is a real problem; a significant project may be bogged down by an overzealous community that insists that their project needs a number of bells and whistles, seeking to make it that much better than the closed source product it is seeking to duplicate. A sort of overcompensation that almost deserves a more psychological examination.

The FOSS volunteer army of developers might not consider their lack of V&V important for a number of reasons. There is usually no set time frame for the development cycle of a FOSS project, meaning that a team can perform simple testing for as long as they like without any thought for the structure or efficiency of their testing, using the 'release early, release often' philosophy prevalent in FOSS. Technically their project is 'released', but not in a way that any professional organization might consider using. Time to a stable release might actually be abbreviated if these projects adopted V&V efforts!

Most FOSS developers in the community are professional developers by day, working with some level of formal documentation and QA or V&V. These developers are all aware - even if at just an abstract level - of all of the steps in the development life cycle including requirements specification and certainly V&V. Knowing that these procedures and paradigms exist and that they exist for a good reason, why then do most developers eschew them when they choose to take up work on a FOSS project?

Perhaps they are victims of a poor implementation in their professional environments, where they feel like they never get any actual programming accomplished because they are bogged down in meetings or professional politics that have infected the development process. Perhaps they work for an organization that has a highly structured and restrictive development life cycle, and they turn to FOSS projects in their free time as a way to do what they love free of the restrictions of documentation, verification, and validation, without QA departments finding bugs in their code, or supervisors evaluating them on their lines-of-code output. Perhaps the developers in our FOSS community spend their days working with a perfect development structure that consists of logical and maintainable documentation and robust and fair V&V processes, but for whatever reason they consider all this overhead a waste of time that gets in the way of their coding (after all, is not the code itself the documentation?), and therefore contribute to the FOSS community where, as we have seen, a need is perceived and a lot of code is thrown at the loosely defined problem until something sticks. No matter the motivation for a professional developer to work on FOSS and ignore what he knows to be best practices or even basic QA, the fact is that it occurs with most FOSS, and the quality of software developed in the community is not what it could be.

Consider for a moment security. Open source means that the source code to all of the applications developed is available for everyone to view, and by providing the source code to applications, it follows that FOSS security vulnerabilities are more likely to be exposed and corrected in a swift manner. This is something that is not always accomplished by FOSS's closed source cousins, who may not become immediately aware of vulnerabilities and, when they are aware, waste time in the bureaucracy of developing an acceptable patch. We would stipulate that FOSS projects are in a unique position to employ V&V. Using this security example, we see that FOSS is quicker to patch discovered vulnerabilities than closed source systems. By implementing V&V in the development process, FOSS projects would eliminate a significant number of these vulnerabilities before the software is made available, and the uniquely balkanized nature of FOSS means that even with V&V in place for software maintenance, vulnerabilities could still be patched more quickly than is possible for monolithic software companies.

Of course, it has been demonstrated that FOSS is actually of a surprisingly high quality in spite of its lack of V&V. The authors of the source material who point this out are likely considering a second or third major release of a FOSS project. By this point the community will certainly have been able to track down and rectify most of the bugs in the system. Often initial releases of FOSS projects are at best fickle in their function, and at worst destructive to the user's productivity by causing OS crashes or other problems. To make FOSS ready for the desktop, the community must realize that they need to implement V&V processes within their projects. As discussed above, most FOSS developers are already familiar with V&V; they are also familiar with building consensus, and would therefore likely soon agree upon a set of V&V processes best suited to the project before them. Those hobbyist developers with no professional experience have already proven themselves to be good learners, and would soon absorb V&V processes into their own personal style of programming. FOSS seems poised to implement V&V, given that the community itself consists of a large population of people who are already engaged in informal testing of a project. By applying V&V processes to this testing across the majority of FOSS projects (rather than the minority, as at present), FOSS will be more stable and reliable when it is made available to the world at large. This will bring FOSS one large step closer to the desktop.

We have seen how free and open source software revolutionized the software development world. The movement has made giants like Microsoft pay close attention, and had other corporations like Novell embrace it as a philosophy for survival. The projects that have done this have embraced some level of QA and V&V in their development processes. Today Linux distributions like Red Hat and Ubuntu are used by companies and home users that might otherwise have deployed Microsoft's Server or Vista/XP offerings. Open Office and its sister release Star Office (offered by Sun Microsystems) exist as viable threats to Microsoft's Office suite, having replaced the Microsoft product in some government and private sector offices around the world. We have also seen how many other large projects do not perform significant V&V, or even basic testing, before releasing their software to the public, and that many smaller FOSS projects exist that would likewise benefit from good V&V processes. As it is, FOSS is a minor threat to closed source providers. With V&V processes in place in projects tiny to large, we feel that FOSS would finally be ready for the desktop.

Appendix A

Figure 1: Documentation by project topic (Zhao, 2003).


Figure 2: Testing time by project size (Zhao, 2003).


Figure 3: Testing time vs. the percentage of faults found by users (Zhao, 2003).



References

Linux Online. (2008). List of distributions. Retrieved November 14, 2008, from http://www.linux.org/dist/list.html

NSLU2-Linux. (2008). Main / HomePage. Retrieved November 14, 2008, from http://www.nslu2-linux.org/

Aberdour, M. (2007). Achieving quality in open source software. IEEE Software, pp. 58-64.

Canonical (2008). Press release archives. Retrieved November 14, 2008 from http://www.ubuntu.com/news/pressreleasearchive

Canonical (2008). QA team – ubuntu wiki. Retrieved November 14, 2008 from https://wiki.ubuntu.com/QATeam

Ebert, C. (2007). Open source drives innovation. IEEE Software. pp.105-109

Furhman, C (2007). Appreciation of software design concerns via open-source tools and projects. Proceedings of the 38th SIGCSE technical symposium on Computer science education pp.454-458

Hasan, R. (2005). History of linux. Retrieved November 14, 2008 from https://netfiles.uiuc.edu/rhasan/linux/

Stallman, R. (1983). Gnu manifesto. Retrieved November 15, 2008 from http://www.gnu.org/gnu/manifesto.html

Thomas, C (2003). Improving verification, validation, and test of the linux kernel: the linux stabilization project. Taking Stock of the Bazaar: Proceedings of the 3rd Workshop on Open Source Software Engineering pp.133-136

Vaughan-Nichols, S. (2005). Red hat's earnings and stock soar. eWeek, Retrieved Nobevmer 14, 2008 from http://www.eweek.com/c/a/Linux-and-Open-Source/Red-Hats-Earnings-and-Stock-Soar/

Zhao, L., Elbaum, S. (2003). Quality assurance under the open source development model. The Journal of Systems and Software 66 pp.65-75

05 December 2008

Dragging Myself Across the Finish Line by my Lips

I'm going to kvetch for a moment. I promise I'll write something better in the coming days. A preview? Zoe likes to eat broccoli - well, broccoli stems. Still in the ground. But presently:
I thought my first semester of grad school was a real kick in the jimmies, finding my big-boy legs when it came to grad vs. undergrad work.
I thought the 10-week Summer semester, where I crammed in 2 grad classes AND designed/developed a new website, was a killer, what with the Summer heat and F being an ocean away.
I now know that this semester was the biggest academic kick in the teeth I've had since 10th-grade Chemistry. I'd say the Fall semester of my sophomore undergrad year, but no one who reads this likely remembers when I tried to take calculus-based physics, chemistry, and a bunch of other academic courses in the same semester.
Point is, Requirements and Verification & Validation cover a lot of the same concepts from different angles. You'd think that would make it easy. Oy God, let me tell you, I'm not going to be walking right for a while after finishing this semester.
I just, minutes ago, uploaded my last final, by the way.
Between having 3 projects in one class, where I was leader of a group that preferred to wait until THE LAST POSSIBLE SECOND (not all of them, I guess) before turning in work, then turned in work that was done wrong, then couldn't be contacted for revisions, and another class where I swore I understood the material but my midterm, and likely now my final, will make me look like a drooling moron.
It's stuff like this what keeps me humble.
Did I mention that while I was in the middle of all this academic splendor, work went from nothing to HOLY SHIT NEW PROJECT EVERY WEEK? Not that I'm complaining about the work! I'm (a) glad to still have my job and (b) I like web programming, so it's a good thing, but it adds a pile of stress when I'm exhausted at the end of the day and the last thing I want to do is read a bunch of really dry and blurry PDF scans of IEEE standards regarding V&V. And I'm the guy who'll read the back of a shampoo bottle sitting on the can, and really think about what all those ingredients must be. It's not just something to do for me; I'm genuinely curious.
This wonderful moment of catharsis I've decided to share with the globe was meant to be one of joy and elation, where I was winded from all the jumping about I'd done in victory over my semester. Instead, I'll likely go to bed tonight with nightmares about failing the class.
It's more than just the money; it's the personal academic pride. It's largely the money, though. Shit grades get me shit money in reimbursements from my employer. Good news for them in these tough economic times, bad news for my bank account.
In the end, my friends, it's done. There are no more exams to finish or papers to write, no more groups to shepherd to completion, no more weekly conference posts to make.
Until January 2009, when I face the next semester. How hard can Software Maintenance be? I mean, I do it every day, right?
