
Wednesday, December 21, 2016

Testing Document for Software Engineering

Testing Overview
Testing was performed in the style of alpha white-box testing.  It was conducted from the perspective of the coder, with prior knowledge of the system.  Unit testing was done on individual web pages.  Integration testing covered the cross-referencing of web pages and the foreign-key attributes of records.  After each of those was conducted, a complete overview of the site as a whole served as the last step, system testing.  Testing was based on the requirements the group presented in the design and analysis phases.  Scaffolds were created for the identified class objects, except for personal information.  Scaffolds in Ruby provide a style sheet, a migration file for the database, a controller for page interaction, a model page for variable augmentation, and four HTML pages.  The pages consist of new, show, edit and index, with a default form for the attributes from instantiation.  Actual testing modules and CoffeeScript pages were also created but were not implemented.  The model of the site was to fit the processes of the use cases and diagram charts from previous documents.

Unit Testing
Unit testing was performed on each page.  Whenever a page has an error, the user is sent to an error page with the details of what has happened, which is also excellent at hinting at how to fix it.  Every single web page was tested to see if it was accessible via the browser.  Once retrieved, each of its links or attributes was tested.  Each new page of a scaffold was to insert the given information into the database as a record visible from the other three pages.  Show displays the data.  To elaborate, media files in the form of video or images must be present, and social media links should go directly to the external page of the user’s input.  Edit allows you to change any attribute available from the form.  Index is a sort of home page for all records found in the database for a given class.  This sequence was repeated for each scaffold created.  Heading styles and the application’s navigation links should also be reliable, as designed, from page to page.  A sample image from the fileload page is attached below.

Integration Testing
The database was scripted so that the login object’s id would be a foreign key of the social media, visual media and student activity objects.  This was to ensure that all information associated with a specific login id would appear on its page.  Once a login was created, the tester would go to the show page.  From show, the tester could then go to one of the other scaffolds to create a new object.  The login id would be a hidden field passed as an inherited session variable.  Returning to the login’s show page should then display the newly created video file, image file, Instagram link, Facebook link, Twitter link and activity update.  Additional route coding was added to the routes page to create the appropriate links for the necessary HTML actions.  Images are added below from the first test with Internet Explorer, where the visual media will play.

The second image is from Mozilla Firefox, where activities were added but video is not supported.

System Testing
A review of overall system performance was the final stage of testing.  Unit testing and integration testing made this simple.  Aside from the review, the database was checked to see if information was entered and updated.  This was completed by inserting data via the website and using the SQLite terminal to check the tables.

Testing Results
 In the early stages there were several errors, as misspellings would lead to object-not-found errors.  Working through those sorts of bugs allowed for the more technical aspects of the intended design.  There was a lot of trial and error involved, as changes were made and then the application was run to see the effects.  One regret was not getting the user authentication page working.  Authentication was supposed to check for the entered login name in the database.  If found, the next step was to take the encrypted password on record, decrypt it, and match it to the one the user entered.  One of the problems encountered was that creating such a record would be inefficient duplication in the database.  If the information was not entered into the database, it would not accept a direct match, as the primary key was not found using the implementation of a session variable.  If I were to use a non-session variable, the user would just be sent to an empty login screen.  Other than that, the final product performs very close to what was expected.  There were some flaws, such as videos only being viewable in MP4 format in the Internet Explorer browser, but I was not able to fix that from a design standpoint; it was perhaps a conflict with the development environment or the source code.  Also, there was not enough time for mobile conversion.  Overall, the final result is that the application meets many of the requirements requested.
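The intended authentication flow can be sketched in a few lines.  The original application was Rails and its code is not shown here, so this is only an illustrative Scala sketch of the logic described above; the names, the in-memory "database", and the toy reversal cipher are all invented for the example.

```scala
// Hypothetical sketch of the intended authentication flow described above.
// The real app was Rails; this only illustrates the lookup-decrypt-compare logic.
object AuthSketch {
  // Stand-in "database": login name -> encrypted password (a reversed string here).
  val users: Map[String, String] = Map("obioku" -> "drowssap")

  // Toy cipher purely for illustration; a real system would use proper hashing.
  def decrypt(stored: String): String = stored.reverse

  def authenticate(login: String, entered: String): Boolean =
    users.get(login) match {
      case Some(stored) => decrypt(stored) == entered  // decrypt the record, compare
      case None         => false                        // login name not found
    }

  def main(args: Array[String]): Unit =
    println(authenticate("obioku", "password"))
}
```

A real implementation would store salted hashes and compare hashes rather than decrypting, but the control flow (look up the login, transform the stored value, compare to the user's entry) matches the description above.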

High End Computers and Computational Science

The term “high end computer” has varying definitions.  Most of the machines that fall under the label sit between the early traditional or simple PC and very large supercomputers.  As personal computers became more popular and readily available, prices also went down.  Generic or cheaper alternatives were then designed as machines with increased performance and capacity began to hit the market.  The blending of the affordable PC with the superior supercomputer provides a machine whose possibilities are nearly endless and that can fit in one room of your home.  If a uniform high end computer can be defined, it will involve the computer being expensive and having elite processing capability.  Cost and performance go hand in hand, as the processors, memory, graphics cards and hard drives play directly into both.  To further elaborate, we are talking about processors with multiple cores and speeds measured in gigahertz, synchronous dynamic memory with many gigabytes dedicated to active services and applications, very high capacity graphics cards, and solid state drives that can hold terabytes of data [1].  The trend in technology is that the new always seems to fade out the old.  Usually the differences are so subtle that only experts in the field can explain what truly separates them.  The latest technological devices come at a higher cost than the previous iterations, and the enhanced performance is just part of the package.  These computers are commonly associated with research experiments, gaming and software development.  One more field that uses high end computers is Computational Science.  Computational Science is a budding field that draws a line so as not to be confused with Computer Science.
From an article posted in the SpringerPlus journal, the author defines it as “being the application and use of computational knowledge and skills (including programming) in order to aid in the understanding of and make new discoveries in traditional sciences, as opposed to the study of computers and computing per se” [2].  That classification describes how computers can be used in a variety of ways to assist with STEMM projects.  STEMM stands for science, technology, engineering, mathematics and medicine.  Aside from the somewhat self-referential case of technology, computers are primarily used to process large quantities of information, break down and solve complex equations, and store relevant records.  Where computer science focuses on the components of hardware and software, computational science focuses on their uses for other fields.  The interweaving of high end computing and computational science creates a very formidable tandem that has advanced computing in many areas.
            One of the primary reasons for this collaboration, as previously stated, is research.  When research needs to produce results, it can be in the form of data visualization.  After gathering so much information, reporting has to be condensed to the facets of the study.  The numbers are broken down into divisions of categorical distinctions.  From the vast total, the research can generate representative parts of the sum.  Data visualization is the process of converting the information into models that can be interpreted by individuals who are not certified experts in the field.  A component of Computational Science can be the talent of reproducing graduate-level research so that it can be received from a novice’s perspective.  Though experts will certainly be able to understand it as well, it is intended for a larger group.  Data visualization has been known to help both very effectively.  The following excerpt from a high end computing publication gives some hints into the procedure: “The scientific method cycle of today’s terascale simulation applications consists of building experimental apparatus (terascale simulation model), collecting data by running experiments (terascale output of the simulation model), looking at the data (traditional visualization), and finally performing data analyses and analysis-driven visualization to discover, build, or test a new view of scientific reality (scientific discovery)” [3].  The computers assist with each integral step.  The design of the model and how it will operate is just as important as formally collecting the data.  Computable models are the models of prime interest in computational science.  A computable model is a computable function, as defined in computability theory, whose result can be compared to data from observations [4].  The model design can be about deciding the actual form of the input and output.
Input can be as simple as inserting numbers into preset forms, or multifaceted, like providing hard files and having the data parsed.  Telling the computer what to do with all that evidence is the computational side of the output.  For data visualization, the output will be a chart, graph or map.  Researchers can present the details of why the model’s data is isolated to create the image it does.  Some of the variations include bar charts, histograms, scatter plots, network models, streamgraphs, tree maps, Gantt charts and heat maps [5].  Would there be a loss in quality or relatability if a histogram was chosen over a Gantt chart, for example?  Discussing the details of what validates the choice of evocative data sets could be a very interesting niche for this study.  All of these, however, use the power of high end computers to create high resolution images, with rich colors and niceties, as the result for computational science.  These images easily translate pages and pages of raw data into a much more digestible format.  Research that may have taken years and thousands of contributors can be reduced to a two dimensional image viewable from one page.
            Three more examples of applications of this combination are parallel computing, grid computing and distributed computing.  From a computational science perspective, grid computing generates individual reports from locations as the information is fed into a primary resource.  Distributed computing would rather produce a comprehensive report using multiple network nodes to collect data.  Lastly, parallel computing uses one main source to assist all the other nodes of a network.  Grid computing and distributed computing function in a very similar manner, as the network architecture is almost identical, but the end result as well as the root cause can be different.  A perfect illustration of this concept is the BOINC program at the University of California, Berkeley.  BOINC is an acronym for Berkeley Open Infrastructure for Network Computing [6].  Its initial release was in the year 2002.  Since then it has cycled through many participants who readily volunteer their computers and services for the research cause of their choice.  And there are many to choose from.  A recent check of the website displayed over 264,000 active volunteers using over 947,000 machines spread across almost 40 projects.  Projects vary in topic from interstellar research and identifying alien lifeforms to what could be the next steps in advanced medicine.  Each one has volunteers dedicating time and resources to what they have interest in.  Each person can go through the process of completing an agreement form and downloading software to become a part of it.  The projects use several high end computers, and maybe some not-so-high-end computers, across the network to receive data at an astonishingly fast rate.  Systems of this type are measured in floating point operations per second, or FLOPS.  That unit of measure would most likely fall under the category of a performance metric [7].  High end computing can be measured by the application, the machine or the combined integration configuring performance metrics.
Instruction sets of the application, processed by the available sockets and cores of the system during a given clock cycle, lead to a number for how many floating-point operations are conducted.  To be precise, the FLOPS amounts calculated by the BOINC structure are converted to petaFLOPS.  The peta- prefix denotes a factor of 10^15.  That quantity is possible because of the immense shared space and very little idle time or mechanical malfunction.  That is an enormous number.  To further explain, imagine the speed you would have to move at to do a task a million times in one billionth of a second.  When the data collection is time sensitive, errors can arise from user authentication and system authorization.  People gaining access to the network, and the network having access to the computer, can be viewed as sources of human error if not handled correctly.  Another area to ensure is security, to prevent interception or modification of information as it is transmitted.  Data needs to be protected as it is passed from node to node.  Since in many of the projects the retrieved data is geo-specific, any misrepresentation or alteration of the records can seriously corrupt the reporting of the final statistics.  Encryption and decryption can play a major role in the security of the project.  There are countless methods to do this, but it will most likely involve a way for the message to be encoded leaving the home location and decoded only once it reaches its intended destination.  Accuracy is critical to enterprise level operations in this field.
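The sockets-times-cores-times-clock arithmetic above can be sketched as a short calculation.  The socket, core, per-cycle, and clock figures below are invented for illustration and are not BOINC measurements:

```scala
// Back-of-envelope theoretical peak FLOPS, per the paragraph above.
// All hardware figures here are made-up assumptions, not real BOINC data.
object PeakFlops {
  def peakFlops(sockets: Int, coresPerSocket: Int,
                flopsPerCycle: Int, clockHz: Double): Double =
    sockets * coresPerSocket * flopsPerCycle * clockHz

  def main(args: Array[String]): Unit = {
    // e.g. a hypothetical 2-socket, 8-core machine, 16 FLOPs/cycle, 3 GHz clock
    val total = peakFlops(sockets = 2, coresPerSocket = 8,
                          flopsPerCycle = 16, clockHz = 3.0e9)
    println(f"${total / 1e12}%.3f teraFLOPS")  // peta- would divide by 1e15
  }
}
```

Summing such per-machine figures over the hundreds of thousands of volunteered hosts is how an aggregate throughput in petaFLOPS becomes plausible.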
            Computational Science and High End Computing are the proverbial match made in heaven.  Together there is a give-and-give sort of relationship where the prospects of each are enhanced by the other.  They are not completely inseparable, however.  Gaming is a major industry for high end computing.  The frame rates of today’s video games are only possible with certain machines and graphics cards.  The minimum requirements for games and applications are sometimes spoken about beforehand.  Computations can be done by human beings given a sufficient amount of time.  Computers have not been around nearly as long as science and math.  People of these fields have performed in them for centuries.  The majority of advancements that reach the mainstream begin from a human-proposed thesis, assisted by, not dependent upon, technology.  But the merger is what allows for the maximizing of effectiveness and time spent.  The combination has expedited and enhanced research in several fields.  It has also been able to broaden the possibilities of what can be done.  Results can be recreated as graphical representations to discuss.  The data conversion provides a visual to better comprehend these often large sets of raw numbers.  In conclusion, as a student of Computer Science, I think that one of the most astonishing feats may just be the idea of the pair growing into its own genre and not staying within the borders of the CompSci discipline.  I am equally impressed by the component materials needed to build and modify a high end computer as by its usage for mathematical and scientific applications.  Hopefully both will continue to flourish in the future with their ingenuity and popularity.  And the next evolution may be just around the corner.

[1] Origin PC Corporation.
[2] McDonagh, J., Barker, D. and Alderson, R. G. Bringing Computational Science to the Public. SpringerPlus. 2016.
[3] Ostrouchov, G. and Samatova, N. F. High End Computing for Full-Context Analysis and Visualization: When the Experiment Is Done. 2013.
[4] Hinsen, K. Computational Science: Shifting the Focus from Tools to Models. F1000Research. 2014.
[5] Data Visualization.
[7] Vetter, J., Windus, T. and Gorda, B. Performance Metrics for High End Computing. 2003.

Report on Scala

                Scala is a programming language created by Martin Odersky and others.  It is intended to be an elegant blend of object-oriented design and functional programming.  There is a belief within its ranks that every function is a value and every value is an object.  Development of the language began in 2001, followed by the initial release in 2003.  After an attempt to improve Java, the project spun off into its own language formulated specifically for component-style software engineering.  The data types of the language are common.  Numbers can be in the form of doubles, floats, longs, ints, shorts and bytes.  Strings, long-form sequences of alphanumeric or Unicode characters, come from the Java library.  There are also individual characters as chars and true/false values as Booleans.  There is some uniqueness, as units, iterables, maps, options, sets and lists are added to the mix of instantiable data types.  Even “empty” values are possible with the Nothing and Null types.  Scala has five primary keywords for creation, as class, object, def, val and var are needed to identify user definitions.  Classes are the design for what an object can be, while objects are just a single instance.  Another keyword, new, is used to convert a class into a created object.  The main method to run an application is customarily contained in a user-defined object.  The def keyword is used to create functions.  The keyword is followed by the name, the parameters in parentheses, a colon, the return type, an assignment operator and the set of instructions contained in braces.  Val signifies a placeholder whose assigned value will not be changed, while var can be modified later in usage.  The syntax calls for either the val or var keyword, followed by the name, then a colon, the datatype, and finally what is being assigned to it.  Scala also has user-defined types.  The keyword type is placed before the name, and then a predefined type is assigned with the assignment operator.
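The keywords described above can be shown together in one small sketch; the names here (Demo, Person, greeting, Age) are hypothetical:

```scala
// Minimal sketch of the class/object/def/val/var/type keywords described above.
object Demo {
  type Age = Int                                  // user-defined type alias via 'type'
  class Person(val name: String, var age: Age)    // class: blueprint; val fixed, var mutable

  def greeting(p: Person): String =               // def: name, params, colon, return type, '='
    s"Hello, ${p.name}"

  def main(args: Array[String]): Unit = {         // main method inside a user-defined object
    val p = new Person("Ada", 36)                 // 'new' turns the class into an object
    p.age = 37                                    // var can be reassigned; the val name cannot
    println(greeting(p))
  }
}
```

Note how the def body follows exactly the order listed in the text: name, parameter list, colon, return type, assignment operator, then the instructions.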
As file organization goes, Scala is very similar to Java in the matter of naming convention.  Also like Java, Scala uses the keywords package and import to define project scope and add external files, respectively.  Scala features two related forms of abstraction.  One is the traditional abstract class, where you provide a simple framework to be used later.  Very analogous to abstract classes is the concept of traits.  Traits also provide variables and methods that are inheritable by another class.  The inheritance occurs with the keyword extends.  The capability for concurrent processing is very heavily associated with the ‘java.util.concurrent’ package.  There are two points to this topic.  The first is having the two library interfaces Callable, which returns a value, and Runnable, which does not.  Then you will have to look at the various forms of threading possible with synchronous and asynchronous tasks.  Where Scala shows promise is in how vast and dynamic it can be.  The possibility to implement nested functions is another positive addition.  For me, a drawback is how closely related it is to Java, including running on a Java virtual machine.
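Trait inheritance via extends, as described above, looks like this in a small sketch (the Greeter trait and its members are hypothetical names):

```scala
// Sketch of a trait carrying both a variable and a method, inherited via 'extends'.
trait Greeter {
  val punctuation: String = "!"                   // traits may hold concrete vals
  def greet(name: String): String =               // ...and concrete methods
    s"Hello, $name$punctuation"
}

object EnglishGreeter extends Greeter             // inheritance occurs with 'extends'

object TraitDemo {
  def main(args: Array[String]): Unit =
    println(EnglishGreeter.greet("world"))
}
```

Unlike a Java interface of the same era, the trait here supplies working implementations, so the inheriting object needs no boilerplate of its own.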
                The difference between Scala and C would be akin to comparing Java and C.  Start with the time period to get a better understanding.  C was created in 1972 and Scala in 2001.  C has been the basis for many programming language concepts since its inception.  Scala is a fairly new language and is heavily dependent on Java.  Much of the grunt work that developers had to do in C has been made much easier by built-in libraries in Scala.  The major difference you could point to would be the same as in comparing other languages.  All languages have a particular syntax for performing common tasks.  The level of complexity depends on the features included in the language, but again they may be more time dependent, as concepts were created by pioneers and then deciphered and made simpler for users who come later down the line.  Completing project one in Scala would be helped by the built-in array sorting feature, but much of the rest would be similar.
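The built-in array sorting mentioned above is a one-liner in Scala's standard library, in contrast to hand-writing a sort in C:

```scala
// The standard library's Array sorting: no hand-rolled sort routine needed.
object SortDemo {
  def main(args: Array[String]): Unit = {
    val xs = Array(5, 1, 4, 2)
    println(xs.sorted.mkString(","))  // .sorted returns a new sorted array
  }
}
```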

Sunday, December 18, 2016

Right to Privacy

I value my privacy, but I have learned at this point in my life that revealing information has its price.  Privacy is more about simple respect than having something to hide.  Sharing creates a glass-door effect that willingly illuminates your presence to others.  Privacy should not be a concept to be violated unlawfully.  It is a right afforded to every United States citizen, implicitly by the Constitution and specifically by the Privacy Act.  Sadly, the same cannot be said of every citizen of the world.  This topic can be sequestered into a human rights debate when honestly assessing what the government of a country should have access to.  But I will try not to veer off into the deep end too much.  Personally, for me it is my creative space.  Being left alone to think allows me to focus and put forth my best efforts.  In a group or public setting, I can function in a read-and-react situation, but many times my initial response will not address the underlying crisis or the overriding subsequent steps that surely will follow.  I need significant time with myself to consider a solution and untangle the utmost conceivable possibilities.  So for me, solitude is very important.  Isolation can be a form of self-seclusion with a positive impact.  Loneliness is the negative connotation of privacy that demonstrates a quarantining effect.  However, we also live in a time where people easily waive that right with the advent of social media.  You can find everything from nude pictures to intimate stories on the world wide web.  I guess whatever you like, you can find in whatever way you see fit.  Sometimes you can even find out what you do not like.  Privacy is such an issue that individuals can comprehend people getting angry from reading what another has written.  People can go to someone’s webpage and tell them they are offended by what was seen.  Freedom of speech is only preferred when someone agrees your words are feasible and worth their acceptance.
It’s not about just being yourself anymore.  It is really about appealing to the most people possible.  Like-mindedness can become group thinking given enough time and space. 
                I cannot envision the situation being reversed, where forcibly imposing yourself on someone can be beneficial to all parties.  Physical attacks or assaults are common criminal offenses in many regions of the world.  Invasion of privacy in the form of computer hacking and videotaping is real as well.  Cases have been presented in court where information or images were released without the aggrieved party’s appropriate consent.  There are other forms with frequent occurrences, but as stated earlier I will try not to make that the center of the discussion.  I suppose it might be a crucial element from a law enforcement perspective.  When a threat is identified, it will have to be intercepted and nullified as soon as possible.  Warrants, along with probable cause, spark the negative sort of interest at which your personal space can and most likely will be invaded.  So abiding by the law will procure each of us the safety and security that we enjoy.  The moral of the story is that breaking the written law can eventually have your rights revoked.

Saturday, December 10, 2016

Supplements For Health

My experience with herbal medicine is a very positive one.  Now admittedly, more often than not, doctor-prescribed medicine is the best way to resolve health issues.  This is not written to condemn modern pharmaceutical drugs.  It is meant to provide insight into alternatives for the less severe ailments that you might experience from day to day.  Shopping online from sites like Amazon, or in person at stores like Vitamin Shoppe and Wal-Mart, can deliver very affordable supplements that can improve your overall health.
                Let’s begin with some of the basics.  Many people cope with various forms of pain, everyone from your strongest professional athlete to the maybe not-so-in-shape person with a desk job where they perform at a computer for eight hours a day.  Mild pain relievers like aspirin or Tylenol are readily available all over the world.  Extended usage in terms of time, or overusage in terms of dosage, can lead to more health concerns.  An alternative is ginger.  Ginger is an excellent anti-inflammatory and pain reliever when taken at the proper dosage.  It also has anti-microbial properties that can help your immune system fight bacterial, viral and fungal infections.  For individuals like me, it provides immense assistance, as my system is very sensitive to aspirin.  When I take low doses I can experience gastrointestinal problems, and higher doses will worsen the conditions that other ailments bring.
                Additional supplements like milk thistle, garlic and cinnamon also provide exceptional health benefits.  Milk thistle focuses on the healing and regeneration of the liver.  The liver is essential to the body for fluid circulation, protein synthesis and, maybe most importantly, detoxifying your body.  Sometimes when feeling ill, taking milk thistle gives that magical cooling feeling that indicates something is healing.  A healthy liver is mandatory for your anatomy and spirit.  Garlic is similar to ginger in terms of being an anti-microbial agent.  My experience leads me to believe garlic is stronger.  Also, many garlic supplements are in the form of concentrated oil, which moisturizes the heart and patches up your arteries.  It helps with acne problems and cholesterol as well.  Cholesterol has its good and bad forms.  Garlic can lower the bad form, LDL, by chipping away at it as it passes through your system.  It will also increase the good form, HDL, as previously stated, by improving the condition of damaged arteries.  Cinnamon also has anti-microbial properties.  There are forms of the supplement that are combined with chromium III picolinate.  The combination is very supportive for maintaining blood sugar for diabetics and for weight loss.  Counterbalancing the sugar in your body is a main function of cinnamon.  Reducing it can increase your metabolism and treat inflammation.  Once your metabolism is sped up, your body can work at peak performance.  The cliché of firing on all cylinders fits in here.
                Three other supplements I would like to add to this entry are soy lecithin, melatonin and acai berry.  Soy lecithin is a form of protein as a supplement.  With its basis in soy products, it can be a primary way for vegans and vegetarians to get a quality source of protein in their diet.  Melatonin is a sleep aid for whenever you start experiencing restless nights.  Taking it can enable a deeper and longer night’s rest.  Acai berry assists in removing stubborn waste from your body.  Additional water and sodium in your body cause bloating and an overweight appearance.  Acai berry really helps in “flushing” those substances and other toxins out of your system when taken consistently.
                I could write about a few more substances, but I do not want to be confused with an expert.  I have crossed paths with supplements like goldenseal, pau d’arco and collagen, mostly for infections and skin conditions.  But I will not elaborate, because maybe a physician or dermatologist can speak about something that can be more effective.  There really is not a middle ground when it comes to health.  Some substances that do not cure will strengthen the disease you are battling.  What I discussed was helpful but is not a substitute for your doctor’s advice.  Supplements can provide significant steps towards progress.  And that alone can be worth taking into consideration.

Monday, November 07, 2016

Start @ The Beginning

Millennial thought process
Centennial work flow
Original penned piece
Defending you as devotional labor
Religious affair
Decisions were absent
Making the best of what I had
Chicken salad from chicken blank
Reserve your place in history
Preserve an image to be seen later
Weigh your options
Clay to mold
Play the field
Stay at home

Tuesday, October 18, 2016

Gentle Resurgence

The stagnation of inspiration
Can be misconstrued as procrastination
Without the visualization
Of my fascination
That oh so special sweet lady of my dreams
For me its always more than what it seems
The distractions of life
May create strife
If words do not breed the enunciation of right
Im just not certain its the same
Her name
Well I wont provide that for now
It is one
Ok maybe two
Sorry I cant lie its could be a few
I aint a playa
I dont commit to what I dont know
Need to see the signs that its really for sho
We can stand eye to eye
And toe to toe
Which leads to movement with belly to belly
And lip to lip
But not much more will be described in this
In time we will get closer
When I can express this on my own
Without the use of a phone
Then and only then
Will this be confidently put to rest
As she drifts into the night
On the same bed
With her head on my chest
Good night baby

Monday, September 05, 2016


Quick Question
If I thought of another way to say this
Would you respond the same 
Would you still play with it
Would you let it rattle around in your mind with patience
Or just recite the words with immediate flagrance
Asperse my syllables
Receive residuals
Pick the definitions carefully
If I missed a line in the rhyming pattern
Is it something to mention
Or accepted as a new phase
Turn the page
Urn the cage
Putting together whole lines
Shouldn't sever cold ties
Reasoning behind the simile
Treason defines symmetry
Expand the concept form to consider
Jam the faucet due term thicker
Which is which
Got the signal on the mound
Threw my first pitch
Not in the usual stance
Plan the switch
QB with ten comrades
Wants to move in one direction
While eleven others
In different colors
Fields there is something to be protected
Elected like rejected
Message as accepted

Wednesday, August 24, 2016

Benchmark of Women's Leadership

Women having roles in leadership is the topic of this article.  This piece was the official report of a study performed by the White House Project.  It discussed how the fields of Academia, Business, Entertainment, Journalism, Law, Military, Nonprofits, Politics, Religion and Sports would all benefit from putting more women in charge.  It later moves on to recommendations on how to acquire more leadership positions and the important things to know about the appearance of each industry.
Now, I do not think the topic is misguided, but I do feel the perspective can be questioned.  To say we need more women to rise up the ranks is fair.  There is always time for a momentum shift in society, where a perceived lower class can be beneficial to everyone if they were able to achieve equal opportunities.  The data as presented does not necessarily say how the adjustment of having fewer men in those positions would be favorable.  As a minority, I understand the call for change.  We can all say why we need to see more people like ourselves in the world.  It is not just for diversity but for the potential of the next generation to have hope.  However, I also know that if I cannot outdo the person currently in the job, or stop their progress in maintaining themselves as the standard, then my efforts will be nullified.  I feel the study could have been helped by concentrating on specific women who would enhance the job if they were to have it, versus emphasizing statistics for why so few women are leading men in each division of the professions.  Condensing this topic to certain women who are the best examples would be a better argument for its cause.  If proven true, they could then open the door for more who have real capabilities of producing improvements over what was previously done.  The report does not necessarily discuss why the numbers are the way they are, but only that they should be different.
If we were able to learn about the women currently in each field, it could add to the reason their proposed alternative should be implemented.  We can have examples where the top earning men are compared to the top earning women and see the disparity.  We can also use illustrations of the upper and lower levels of women and determine how that came about.  Let’s study whether that segment of data is copacetic or if there are biases and preferential treatment being accepted to get to that point.
I learned that women are underrepresented in many areas of leadership throughout the workforce.  Nominating women for roles and comparing current participants would provide more accurate information for someone who is not aware of the situation and is using a novice’s viewpoint.  The report shows that leading to correct the course is as important as calling for more representation for results.

Usability Study Of Frostburg's E-Mail System

This study was intended to assist in the improvement of the Frostburg State University email system, developed from Microsoft’s Outlook.  Microsoft Outlook is one of the many products developed by the corporation that has become trusted among countless users.  However, I recently experienced some difficulty.  After subscribing to the discussion board links, my inbox was immediately flooded with messages.  Messages for every possible transaction that occurred.  I was notified of every thread created and every reply to any post.  I needed to filter these to a new folder.  I have created a new folder but I have not filtered messages from my inbox to it.  I looked for how to do filtering and could not figure it out.  Then I did a search on how to complete the task and discovered it was called sweeping in the Outlook system.  This was new to me.  Once the terminology was revealed to me, the steps to make the adjustment were easy to follow.  It was just getting to the first step that was challenging.  Then I wanted to consider other aspects of mailing.  For instance, I have never intentionally flagged a message and do not know why someone might.  It is similar for automatic replies, though I have a much firmer understanding of why that may be activated.  So this is the basis of what this study will focus upon.  My goal is not really self-serving.  I would like to find out whether I am the only one who has found trouble with some of these tasks.  I have asked some classmates for assistance in gathering information to get a sense of other user experiences.
                I did not conduct face to face interviews for this assignment.  Instead, I sent a questionnaire to each person who said they were willing to participate.  Each contained fourteen questions for the user to complete and return when done.  All users had at least one email account.  Not all have used all the features though.  This indirectly states that these may not be of necessity.  None of them have used the filtering.  One participant said they did not because their inbox did not require it.  More users have flagged messages though.  Flagging a message activates an alert on the message.  Users have said they have used this as a reminder for future use or for indicating a certain sender.  Those can be instances for filtering but that was not the option chosen.  To filter, a user will have to create a rule and decide on a string that will capture those messages.  For the examples provided, you could move those messages to a new folder or select something specific such as the sender.  Flagging uses fewer steps as you only have to click the indicator next to the message in many mail clients.  Some of my reported users have tried the automatic reply feature.  Automatic reply can give a pre-entered response to any incoming messages you receive.  This is an appropriate replacement for whenever someone will be temporarily unavailable for an extended period of time.
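Out of curiosity, the rule-based filtering I described can be sketched in a few lines of code.  This is only my own illustration of the concept, not Outlook’s actual rule API; the message fields and folder names here are invented for the example.

```python
# Sketch of a rule-based mail filter in the spirit of Outlook's
# "sweep" feature.  The Message and rule shapes are invented for
# illustration; a real mail client exposes its own rule API.

class Message:
    def __init__(self, sender, subject):
        self.sender = sender
        self.subject = subject
        self.folder = "Inbox"   # every message starts in the inbox
        self.flagged = False

def make_rule(match_text, target_folder):
    """Build a rule that captures any message whose sender or
    subject contains the chosen string and moves it to a folder."""
    def rule(message):
        if match_text in message.sender or match_text in message.subject:
            message.folder = target_folder
    return rule

def flag(message):
    # Flagging is a single step: mark the message where it sits.
    message.flagged = True

inbox = [
    Message("board-noreply@example.edu", "New thread created"),
    Message("professor@example.edu", "Assignment feedback"),
]

# Sweep all discussion-board notifications into their own folder.
sweep = make_rule("board-noreply", "Discussion Board")
for msg in inbox:
    sweep(msg)
```

Flagging, by comparison, is a single in-place step, which is likely part of why more of my participants had tried it than had set up a filter.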
                Oddly enough, the participants answered that all the features were necessary to their daily experience with the checking and reading of e-mails.  Even if they do not personally put all of them to use, they feel no need to remove any as an option in the application.  Except for one that I will discuss more later.  This is a fair reaction to my study in my opinion.  When requesting information from others, all opinions are valid and assist your cause.  Though they might not personally attest to its usability, they think all are usable.  The features are important to the application itself as suggested by the software’s provider.  I cannot express with certainty that a similar study was done on their behalf but I am sure there was some discussion before the final release.  I did not ask the users to perform these tasks for my report because I preferred a natural response as to whether they already found value in the function prior to being asked about it.  I find there is more information in having experience beforehand than reacting only to my request.  These results are surprisingly positive, for the most part, about how we view this technology.
                There are two standard definitions for the word usability.  The first is capable of being used.  The second is convenient and practical for use.  From my own thoughts in addition to those of my contributors, I find that these features are obviously capable of being used.  That is not the issue of discrepancy here.  However, the reports show that it is not really practical to do so.  Though it was not an overwhelming number, the skewed results suggest flagging is preferable over filtering or sweeping.  I, personally, am on the other side of that argument.  A larger number of responses may say something different but that is not conclusive.  To indicate a message needs attention is not like sending it to its own section.  Good design allows either to be used, which is not true of all mail servers.  Kudos to Microsoft and Frostburg for that.  Automatic reply was not heavily used either but I know firsthand that it is more of something derived from corporate culture and not necessarily a student’s problem.  Whether on vacation or just away for a certain period of time, employees usually use the response to route urgent questions to someone who can help when unavailable.
                In conclusion, I found this to be a healthy practice.  The testing of the features and user input provided some valuable information.  The small number of contributors I could gather, however, does not allow me to determine whether an alteration is mandatory or not.  This report can be presented but a larger swath of data is needed to discuss if an application of this size needs improvement.  There are suggestions too for the sweeping feature.  One is to have the option of using sender and date, which I think is available.  Another is to undo, but would it be a fair sacrifice of many to address the request of one?  Part of the usability study conducted was to enhance, not detract.  For this sort of application, I do not feel it is required to remove any feature that is part of the design.  Removing would be the same as not using it if you do not like it.  It is not a core mandatory component that you must interact with for the main functionality.  Safe to say, as with all technology, there will always be periodic updates for the advancement of software.  And I will paraphrase one response as simply “I am happy with what I see”.  Thank you.

Tuesday, August 16, 2016

Summary of the Book The Design of Everyday Things Written By Mister Donald A. Norman

Chapter one addresses the mindset of how we interact with common elements.  The author, Mr. Norman, starts by discussing how doors are used and how they can be misused.  Frustration can occur when there are no visible signals that the user is accustomed to.  He then talks about the concepts of discoverability and understanding, attributing them to the complexity of modern devices.  Discoverability is about how actions are to be performed and understanding deals with interpreting what the conditions mean.  There should be an inherent naturalness in believing what you see.  There is not too much more to comprehend or decipher from design, is what I think he is trying to advocate.  That is until we encounter a difference.  This critique introduces affordances and signifiers to the reader as what actions are possible and what might indicate how to accomplish them.  This relegates explanation for some situations while others are to be understood before any action is taken.  Conceptual and mental models feed into our idea of the system image.  Mr. Norman reiterates the importance of visibility and mapping before closing the chapter on feedback and a soliloquy about designing things well being more difficult than you would think.  The acceptance of technology faces a testing battle in its early stages.  Then, through maturity and ideas feasibly spread by users, good design becomes widespread and easier to manage.
                Chapter two begins with an anecdote about his landlady.  This evolves into a discussion on user error and whether it is partially due to design.  There is an allegory of two gulfs for people with new technology.  The first is for execution, figuring something out, and the second is what happens as evaluation.  Bridging the gap between the gulfs are the seven stages of action, which are, ironically, talked about in different sections.  Determining the goal as discoverability, planning for feedback, specifying the concept, affording the performance, perceiving the signals, mapping your interpretation and comparing the constraints are the main points of that topic.  Blaming yourself for errors caused when using devices is not a healthy practice.  The task of designing items so their intended use is infallibly mistake free is not your responsibility.  In a previous edition of this book, the author goes on to discuss misconceptions of everyday life using the intriguing example of bullets, which is valuable for this chapter.  Intuition does not explain how one fired horizontally from a gun will hit the ground at the same time as one dropped from a hand.  The speed it travels horizontally has no effect on the rate at which it descends.  This leads to people being explanatory creatures as events like this are proven true or false by human collaboration.  He then disputes his earlier point by saying that people do not blame themselves all the time for events and look for a cause to dispense culpability.  Learned helplessness comes from repeated failure.  Positive psychology tries to acknowledge that these things happen and something reaffirming can be gained from it.
                Chapter three inserts a striking theme that behavior is constrained by one’s knowledge.  How much we know will dictate how we act in particular circumstances.  Though knowledge is always in the world, the requirement for it reduces as we improve in other areas.  Reduction is the need to simplify things and give structure.  This is the case when dealing with precision and memory.  Memory is the knowledge in our head gathered from the world and retained for our own purposes.  Memory in humans, like in computers, can be either short-term or long-term in form, and the two may have no relationship or be linked by a direct correlation.  There is a natural mapping effect to that concept that draws from culture.  This is on display with his question concerning the timeline, paraphrased as: what is in front of us and what is behind us?  This is primarily directed by point of view but we should allow a subjective amount of time for users to assimilate to the newer ideas of the world.
Chapter four begins by recalling points from the previous chapters: the two types of knowledge and the components that factor into each.  The author then uses the example of Legos to support his point.  The pieces, as they are separated, do not provide instructions for children on assembly.  They do give a visual of the finished product, and the constraints that only allow them to fit together a certain way play into the final determination.  Even with constraints being an indicator, some trial and error will actually complete the design.  There are four types of constraints.  Physical constraints allow for a strictly visual interpretation for the user.  Cultural constraints apply to individuals who should conform to certain standards and conventions of those around them.  Semantic constraints deal with understanding the meaning and defining the actions required for use.  Logical constraints are the inherent order of the design and subsequent result of a device.  Mr. Norman adds affordances and signifiers to the discussion using the models of doors and light switches.  Afterwards, he introduces a new concept of constraints that force certain behavior.  These constraints do more than prevent and have varying locking aspects to them.  The chapter then turns to design in the form of the faucet.  Faucets tried to acclimate to the user with different ways to control the temperature and rate of the released water.  It draws to a close with the advice that sound is necessary from a cultural standpoint to alert other people.
                Chapter five revisits the topic of human error versus perceived bad design.  When error happens we should try to discover a root cause to the matter, asking five additional whys about certain particulars.  Another concern that can be interpreted as error is violations.  Deliberate violations are intentionally done by users and may be punishable.  A major cause of violations can be rules that are inappropriate enough to encourage damaging acts.  Errors can be classified as slips or mistakes.  Slips are improper actions taken toward the correct goal.  Mistakes do not aim at the right ending.  Slips are classified as capture of the wrong activity, description-similarity confusing the target, memory lapses and mode errors giving the controls different meanings.  Mistakes are classified as rule-based, where identified practices are given but not followed; knowledge-based, where lack of user expertise can be involved; and memory lapse again.  When errors happen, the key is to learn of them quickly.  Once they are, they should be reported by the user or the witness to the event.  This can help the manufacturer design for users by knowing what can go wrong.  It is simple to design a device to perform perfectly for intended parameters and usage.  A study of errors might actually improve design.
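To keep the classification straight in my own head, I sketched the slips and mistakes categories as a small structure.  This layout is only my own summary aid, not notation from the book.

```python
# My own sketch of the slips-versus-mistakes classification
# summarized above.  The dictionary layout is an illustration,
# not anything from Norman's text.

ERROR_TAXONOMY = {
    "slip": [                      # right goal, improper action
        "capture",                 # the wrong activity takes over
        "description-similarity",  # confusing the target
        "memory lapse",
        "mode error",              # controls carry different meanings
    ],
    "mistake": [                   # the goal or plan itself is wrong
        "rule-based",              # practices given but not followed
        "knowledge-based",         # lack of user expertise
        "memory lapse",
    ],
}

def classify(kind, subtype):
    """Return True if the subtype belongs to the given error kind."""
    return subtype in ERROR_TAXONOMY.get(kind, [])
```

Seeing memory lapse appear under both headings makes his point that the same surface symptom can come from very different underlying causes.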
                Chapter six begins by reminding the reader how to analyze errors, mentioning he never tries to solve the problem he is asked to.  The root cause may be separate from the five whys is what I think he is inferring.  Finding the right problem first is just as important as finding the right solution.  The human centered design process encompasses observation, ideation, prototyping and testing.  This assists in the natural challenge of design, where nothing lasts forever and can always be improved.  Something is usually designed because there is a need for it.  Therefore, it is being created to address a past void and not for what is to come next.  It must meet the requirements of the designer, the end user’s special cases and every intermediary along the way.  So complexity is good until it causes confusion.  To evade confusion, standards should be reinforced.
Design for the business world is the theme of the last chapter of this book.  There are two overriding concepts: competition and innovation.  Competition has a confusing aspect where organizations try to separate from each other by using the same ideas for the same products.  Very slight yet intricate differences are the dividing line between sport and lawsuits.  That also involves innovation, in terms of being either radical or incremental.  Products that exist need improvements.  This can be dangerous experimentation when customers are happy with what is current.  The features that are suggested by research to be added on can be just enough or excessive.  This can jeopardize the certainty you know for the possibility of reaching.  And new products face definitive challenges such as implementing change and how long it takes to be accepted.  The change can be forced but customers purchase products with a conforming mindset.  A key statement he makes in the end is that as nearly everything changes around us, there will always be some fundamental standard that will endure and be relevant in any time period.

Summary of the Book User Task Analysis and Interface Design Written By Miss JoAnn T. Hackos and Miss Janice C. Redish

Interfaces are primarily how users interact with the product.  Good design is important to this feature being productive.  This is not an easy task, however.  Problems can arise from making the application too complex or from focusing on the product more than actual user requirements.  The authors are writing the book to help with doing this successfully.  Interfaces are usable if the user can perceive them as such.  The authors go on to emphasize that user and task analysis is the first part of any project.  This is done to address the seemingly systemic issue of bad design.  Correcting it in the early stages will save money during the rest of the process.  There are many components to this type of analysis.  Study of people and culture plays into it.  So does studying how people think, learn, communicate and behave as consumers.
                Learning about and thinking about the user is the theme of chapter 2.  As designers, we should want to know the intended user and the decision method that leads them to choose the product.  Research who the users are, who they share the product with and who else may be exposed to that activity.  There are sort of two parts to the user profile.  How do we define the user and how do they define themselves?  We can implement a research team and try to discover the ideal group we would like to connect with.  We can then look at self-revealing facts like employment and actual usage by the user to see if it matches the preconceived assumptions.  The results of the study should show the differences among users, making it more personal for individuality.  The more data you have, the better your strategy for promoting your design.
                After assessing the user, we should then discuss the tasks.  The “task” itself is about how this product will help the user work or develop something to work.  This can be inspired by seeing how tasks are accomplished without your concept.  Again, start with the user’s goal.  Then see what issues they may have when trying to complete the task.  You can then think about refining the process or filling the void in some way.  Analyze the workflow of the user with the task of the job and combine the grades for your own process.  Reanalyze your initial findings as your users mature from a novice level to becoming experts.
                You have to incorporate every aspect of the user environment in the scope of your design.  There are many things to consider for the physical environment.  The size of the work space is one of the characteristics to consider, along with noise, cleanliness, lighting, room temperature and dangers.  Designers should also comprehend the speed of system reaction and the sources of information the user will need.  All the characteristics and any possible dangers should be discussed with engineers before proceeding.  Social environments are about the processes of a task and how they will be divided.  This can vary highly and is subtly suggested not to be presumed.  Cultures permit certain disciplines and vocabularies that we must pay attention to and be aware of.
As you make assumptions, you may need to observe the users in their environment in action.  There are many cases that can be presented as resistance to this procedure.  This may be generated from your own organization or the designated users themselves.  Once you can verify its essentials, it is recommended to complete tests on a small group of users to challenge or confirm all the information previously recorded.  As you prepare the business proposal, you will need to calculate the approximate return on investment and where the result will be in proximity to your competition.  Once this information is collected, you can supply it in a suitable format to management.
The techniques used in site visits are critical and can play directly into your result.  The way in which you address and converse with individuals about inquiries and tasks are important factors.  Part of your goal is to get them to become willing participants in the study with events like role playing, interviews and walk-throughs.  This type of work is to provide a better product as the main service, but connecting with your customers in a friendly manner is not a negative.  You should also be able to perform more traditional assignments and share how you handle your job to improve theirs.
Setting up and preparing for the visit, I assumed, would be very similar prior to the reading.  You cannot skimp or take any short cuts in setting up for the event.  Secondly, you have to prepare with your team and make sure there are no holes in the setup.  Make sure that the plan is correct and perfected and then follow through with accurate timing.
As you conduct a site visit, you should use the opportunity to refine your observation and interviewing skills.  See how the users work in their environment.  Try to define their job and how it is completed.  This will only help to assist in the continued planning for identifying the task and its improvement.  Begin to replace the assumptions and inferences with the realized outcomes of your view.  Also, as you conduct your interviews, pay attention to verbal and non-verbal responses.  Specify the information you intend to learn but be wary of forcing participation and making participants uncomfortable.  Your notes should contain quotes of user answers and other information that they felt could contribute to the reporting.  The interviewer should have a well-thought-out plan.  However, since you are working with other people who have their own requirements, you should be flexible as to what you can be stringent on.  Ending as a good experience is as important as gathering the data.  As you analyze your time on the visit and the information provided, you can begin to format it for presentation.  Categorize the data according to how it was collected, by user, environment and task of the associative variety.  This will emphasize who provided the corresponding information and how it may relate specifically to their situation.  You want the presentation to become an accurate representation of the work done.  The best methods to use for your presentation will depend on the other members of your contingent who are responsible for completing the assignment.
You should begin designing from what you have learned.  Metaphors are imperative to the concept of interface design.  They are the connection from the supposed real world to the virtual world.  Scenarios and sequences should be used both to prepare your organization for the product and when you are beginning the roll out for waiting customers.  Diagramming with models and storyboards is key to visualization of the process of creation and hierarchy of the developed ideas.  Prototyping is the next step of design.  Now a prototype can be anything from a 3-D object to the first usable version of the interface.  Building and evaluating this item is part of your job.  When testing the prototype, review the initial or provided list of requirements of the user.  It is pertinent whether it is capable of meeting those standards as an unfinished product.  You should also see if it can go beyond that in this phase of design.  As a prototype, you can continue to make improvements and corrections from previous research and with the feedback gathered from testing it.  Request adequate responses from willing participants for enhancing the product.  The last step is documentation and training.  Provide a thorough written report on what the product is intended to do, how it should be used and all other relevant information any user should know.  Training trials from within the company can grow into excellent tutorials for others.  An organization should be prepared for in-class and remote sessions to teach about the product and maximize its effectiveness.  And these should be catered towards the audience and not just written from a perspective of wisdom.

Saturday, July 30, 2016

I Was Possibly Born Today

Spreading the rumor
Because they might not extirpate any other way
Too much to handle at one time
I’ll numinously age like the finest wine
Using the tempo to expose if I’m clever
Drawing from both sides outside of the whether
Given pigmented skin
Doubles the intentions as predetermined
Heard the thief’s theme and the preacher sermon
Then repent for the contour of what I’m learning
Found it too easy to just be myself
Even diverting to another route remained in stealth
Jokes make them smile
Rage displays the fear
Aggression is mutual
While fondness is partial
At last that’s what I saw
We’ll see how long that can hold on as only true
Nine times out of ten
Name of the game is to let it happen again
But when time’s hard mate
You gotta check for your one move to escape
Provided with sustenance that’s digestible
Till preparing something victual on my own was suitable
And now that’s the usual to me
Susceptible to adapting to the requirements of maturity
Relegated to assistance at the beginning and the end
Banished to reproduce in the intermediate
Just the ways of the world
And the system of life
Bring it back to one
If you dare say it twice
Three’s company
Four-eyes for sightseeing
Give the boy a hand
To be a developed human being
So if I had to assume for seven-up
I might only pick my last
Wondering what was on the up and up
Delivering commandments to follow
Can seem quite hollow
If the perceived model
Does not live up to the bravado
However let me ossify the spotlight
Since we all have our life to live
Yours is yours
Mines is mines
And magnify this day if they were to ever intertwine

Sunday, June 26, 2016

Home Ownership Is Often Considered A Cornerstone Of The American Dream

Owning a home has become one of the mainstays of the ideal adult life in America, along with marriage, wealth and happiness for yourself and those in your social circles.  Homeownership is a primary symbol of accomplishment.  It displays that you have achieved many of the necessary requirements to meet those illustrious qualifications.  However, inexperienced buyers can face hardships and exploitation in a seller’s market where greed can become a cunning offense ending in a bad contract for the new owner.  This can involve the overall cost of the home, the land it is housed on, the area they will reside in and the structure of the building.  Real estate developers have the advantage as they can also decide the price of a listing.  The majority of merchants are very honest in dealing with the public in this.  However, some situations can lend themselves to gross inflation in pursuit of additional profit.  It does not really matter if the property is in high demand or if one party is assured to make the purchase.  The bottom line is usually the finalized deal and not the process that led to it.  Houses have been built on unstable land where damage to the property and injury to the residents have occurred.  A portion of these events are caused by natural disasters or unforeseen events which are not included in the insurance policy.  Again, there are good retailers and lenders who will mention this information up front, establishing the pros and cons.  There are others who will never discuss it, leaving an entire life’s work up in the air to chance.  The area can also be confusing to new occupants looking for a place to dwell.  Getting a genuine feel and sense of the surrounding area is something many homeowners do when acquiring their humble abode.  Some try to gather data on that prior to making an informed decision.  While many more will take the realtor’s word as gospel without doing their own due diligence.
If I could suggest any advice, it would be for potential home buyers to take time to learn about this topic.  Do your own research before agreeing to the final purchase.  You can enter the discussion with all the questions you want to ask or ready to have a tasteful dialog where you can stand on your own.  Education is an incipient factor to any inchoate operation one will indulge in.  Learning about a subject can reveal new information and modify your preferences or lead you to be further entrenched in your position.  Use your time studying to know what you like and what else may be available.  Understand why there are so many options and what factors play directly into the pricing.  Psychologically, there is a big difference between what you can afford and what you should perhaps continue to save for.  You may have heard of the story of the three little pigs, where each built a different house for his own reasons.  Once they encountered danger, only one house was found to be helpful.  Rash decisions can have severe ramifications later that would be avoided if patience were used.  Actually gaining information about land, the surrounding area, building specifications, interest rates and mortgages can give some leverage to negotiate with a realtor.  Now you might not want to bicker about what is discussed, but you should be able to make corrections or insert suggestions as part of a deal.  This can also enlighten you about how someone may try to cut corners and get over in some way.  Proper preparation will lead to favorable results the majority of the time in most situations.  People can offer misinformation but you are ultimately responsible for signing your name on any agreement.  At the end of the day, we have to own all of our failures, successes, mistakes and achievements we create or encounter from the ground up.

Thursday, June 09, 2016

A Question Of Preference: Career or Family?

At this point of my life, the decision between career and family is difficult due to the complexity of my current transition.  The flow of my life is centered on the cogent tendency of completing my education to begin a career while ramifying from the family I ripened under to the one I would love to handpick and develop with.  Both contingents are in flux as what I am doing now leads to what is to come.  However even if I were able to ace my part with flying colors, I am left to still be dependent on another’s approval to complete my mission in its entirety.  Therefore it is a rigorous process to conk out the higher priority as my best effort may not be good enough for each case.  In either situation I would prefer to be a primary option first.  That would provide the secondary route where whatever I have done will be accepted and pursued by someone.  The burden of proof is whether I can take the offered opportunity with gratitude or continue the dogged ambition of only getting what I originally set out for without considering to settle for anything construed to be less.  Then you face the battle of pride versus rejection and perception versus reality.  Begin the procedure comparing what has happened with what your vision was and ponder the question again as the results become nonvolatile and fixed.  Sometimes you will find that all you worked and hoped for can inevitably be revealed as evanescent.  When nothing lasts forever, life actually gains the ability to teach the lesson that complacency is, in itself, a questionable attribute to have.  Contentment can encompass not only sustaining but also enhancing what you have already attained.   Two paths are enabled in both points of this discussion.  The first can be just refining your role and capabilities on the job or delving into a novel and scintillating appreciation for your spouse or partner.  
The second can be searching for that long-awaited promotion, or expanding your family in number through marriage or childbirth.
If everything were fair and neutral, then I guess the question just comes down to which I want more.  A family can provide motivation to reach certain plateaus.  A career, similarly, imparts evidence that you are a valued member of something.  If one were to exist without the other, that would tip the scales in favor of a career.  You could still use the income to socialize, invest and enjoy yourself, minus the additional responsibility of maintaining more lives.  A home would not remain a happy one if a family could not be supported by its chief breadwinner.  Nonetheless, I know sometimes I would rather return to headquarters, where I am loved and adored by those who mean the most to me.  I suppose a different perspective could allow ample time to spend with family after a proud retirement from a successful career.  The balance is confounding, as one can always be made to look like the other if you want it to.  Honestly, I could watch the pendulum swing on this topic for eternity.  In my mind, it is just like asking whether the chicken or the egg came first.  You can make a justifiable argument for either while a sufficient counter is always available to suggest the opposite as the predecessor.  Maybe it would have been simpler to say "I don't know" until my life is directed one way or the other.  A holier view may well let the issue be resolved by prayer or through faith.  Then I would not have to answer the question, since a sanctified providence would have had a hand in making the determination with me.  And finally I could trust and believe that the outcome was the only decision conceivable.

Wednesday, June 08, 2016

Lady In Red

A girl once said she liked how I rhyme
So I decided to write a message line by line
She hasn't seen my tears
And she doesn't know my fears
But she sees a smile
Whenever her words touch my ears
I don't realize how deep my feelings for her go
And find it strange that it's my emotions she controls
I get lost in her eyes
And stare at her lips
I cherish simple hugs
And daydream about her hips
She shapes my opinion and changes the mood
I still can't figure out if that's bad or good
Rambunctious laughter
Is what we share
For the rubric of before and after
Exhibits the improvement came later
In the moment given
Only caters to what's living
Igniting kerosene
Illuminates the odd liaison
All we see is green
Preserving creation

Sunday, May 22, 2016

Reflecting Heroes & Villains

I wanted to share two of my favorite comic book characters in this entry.  This is partially due to the popularity of superhero movies in the last decade.  These successes have come from the Marvel and DC brands.  There is one villain and one hero, one from each, that I feel a paranormal connection to.  Though I cannot really explain it, I will try my best to say what I like about each.  The sentimental significance of these beings is predetermined by the labels assigned to them.  So it is easy to want to see the hero win and the villain lose.  What happens when we begin to want the opposite instead?  What happens when the momentum shifts, or never starts in some ways, as we root against the written nature?  It's disorienting, but it is still entertaining at the end of the day.  Who knows, maybe one day I could portray one of them in a movie, which has not been done in a full-length film to date.

The hero is Cable from the Marvel universe.  Cable has many powers, including superhuman strength, telepathy and time-traveling capabilities.  Now I do not have any of those abilities per se, but that is not the whole story.  Cable is the prototype of the antihero.  The troubled individual with an obligation to do good even if it requires bad actions.  He also battles a techno-organic virus, which has already complicated him by disabling some of his body.  But the disease has not defeated his evolved martial arts and weaponry skills, which he uses so masterfully to combat his enemies.  My favorite attribute of Cable also includes baby Hope.  Hope is a character who has mankind's fate placed upon her at a very early age.  Cable sees this potential and protects her by keeping her close.  At times, he even places her in a satchel of sorts on his chest.  The symbolism of this is that "hope" is always near and constantly growing with us.  We have to nurture and shield her to allow her to reach her predefined destiny.

The villain is different in every way from Cable except for my belief in his core persona.  Brainiac, from DC and the Superman comics, is who I speak of.  Brainiac, as you may deduce, has superior intellect that separates him from nearly every character in his presence.  He is an expert in almost every science known to man, as well as in human behavior.  It's a moiling mystery to ponder whether mental capacity led to his evil nature or if the sequence was in reverse.  Villain roles are more intriguing, as you have to reveal why someone chooses the other side.  The hero usually experiences something that makes them want to stop anyone else from going through the same, or feels a calling to do good given their unique abilities.  Villains, however, use their greater prowess to reinforce the idea of inferiority in their quest to conquer.  The dichotomy alone is captivating.  Brainiac is very advanced and has several forms that appear throughout his tenure as an adversary.

The two characters chosen have few things in common, as previously stated.  Cable is far less known than the likes of Wolverine, Spiderman or Ironman, just to name a few.  However, as in all comics, he is the sun around which his universe revolves.  In his world he is the featured main character, and he plays a lesser or secondary role as a special guest in the worlds of others.  To break down the character in real life may show a person with great resilience and the power to read thoughts, or another way of saying it, an enhanced value for anticipation.  Time traveling can be translated into directing the future and defining or encapsulating the past.  And we can always protect those close to us, those we believe in and those we connect to.  Brainiac is the nemesis of one of, if not the, most popular characters ever.  His supreme intellect is a perplexing balance to all of Superman's gifts.  Speed and strength can be counteracted with genius.  But more than anyone's competencies or proficiencies is their desire at heart to assist or destroy.  That is essentially defined as good and evil until we place a mirror at the scene and you can finally see the other.  It gets comedic in a sense when all your plans for world domination are foiled by some annoying being who's always there.  Or when you have nothing to do because people can handle their own issues.  You are reduced to just having fun with your mutated powers.  Both sides need each other, but the decisions that play into the division are the most important story to tell.  And every iteration of the contrast of forces is something to write about.

Friday, May 20, 2016

Object-Oriented Programming and Design

Object-oriented programming was created with coders and designers in mind.  It has expanded to many different languages, as it has proven to be efficient and sustainable among the options that are available today.  The main concepts of encapsulation, abstraction, inheritance and polymorphism are the core principles of this idea.  Building large structures into small, manageable objects with multiple capabilities is the valued philosophy of object-oriented programming.  Since its early days at the Massachusetts Institute of Technology, it has become an industry standard and will remain relevant for years to come [1].
Object-oriented programming and design is a topic that covers many of the modern languages we use today.  The focus of object-oriented design, in my opinion, is to create forms of data to be reused.  Java has the slogan "write once, run anywhere," which supports my definition.  Java is a class-based language, introduced in 1995, that is one of the better examples of being able to perform all the ideas of object-oriented programming [1].  To define a class with variables and methods means you can then create an object of that class to use in your programming.  It can be used in the current program, and if you need to reuse the concept, it can be incorporated into another program without having to be rewritten.  That is the beauty of object-oriented programming and design.  It places emphasis on the coder to feature the abilities of encapsulation, abstraction, inheritance and polymorphism for optimization of the language [2].  These are coding principles specific to this type of programming that I will discuss later on in more detail.  The term object-oriented began in the early 1960s at MIT with the development of the LISP language.  I would prefer the languages of COBOL, Visual Basic and Java as examples of implementation, since I have more personal experience with them.  I had a course devoted to each while enrolled at Frostburg State University as an undergraduate student.  COBOL actually stands for common business-oriented language.  Appearing in 1959 after being developed with government assistance, COBOL is thought of as a procedural language by many [3].  It can be used to generate forms and report on data related to business interactions.  However, with the ability to create variables and invoke classes, I will use it as an early example of an object-oriented one.  Visual Basic is considered an event-driven language.  Primarily, the user creates a graphical user interface and then decides what can happen once certain events are triggered.
Events can be defined as user interactions with the form, such as when the mouse is clicked or a button is selected.  Whenever something happens, the application should have a response.  Again, I will use it to show some of the OO capabilities that can be found there as well.  Also let me introduce some terms that I will use throughout my writing.  Variables are individual allocations of memory of a certain data type which can have values assigned to them.  Data types are the form of the information.  Some common examples are integers for numbers, chars for individual alphanumeric characters, strings for multiple alphanumeric characters and Booleans for true or false values, also known as the one or zero bit.  Methods or procedures are a series of commands or instructions that take place once the method is called or activated.  Classes are container elements for some combination of the previous two programming concepts that can then be treated as an object.  It is important to know these definitions before we go much further into this topic.  From an expert's point of view this may be fundamental, but I will try to write from the perspective of a novice so it may be appreciated by a wider audience.
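To make these terms concrete, here is a minimal Java sketch.  The class name Greeter and its contents are my own invention for illustration; it simply ties together a variable of a data type, a method, and a class used as an object.

```java
// Illustrative sketch: a variable, a data type, a method and a class in one place.
public class Greeter {
    // a variable of data type String, holding the name to greet
    private String name;

    // the constructor assigns a value to the variable when the object is created
    public Greeter(String name) {
        this.name = name;
    }

    // a method: a series of instructions that run when it is called
    public String greet() {
        return "Hello, " + name + "!";
    }

    public static void main(String[] args) {
        // create an object of the class and call its method
        Greeter g = new Greeter("world");
        System.out.println(g.greet());
    }
}
```

Once the class exists, any number of Greeter objects can be created and reused without rewriting the greeting logic.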
One of the first components I will discuss is encapsulation.  Some may call encapsulation the process of concealing parts of your program.  I prefer the slightly lighter view of building capabilities into your objects.  It can be as simple as assigning a value to a variable.  Instead of repeatedly working with a constant value, you can define a placeholder to use throughout your program.  As I said, that is the simpler view that can be performed in COBOL, Visual Basic and Java, but each does it in a different way.  COBOL uses the PIC clause to declare variables in its data division [4].  In Visual Basic, coders use the Dim keyword to name the variable and then state what type it is.  In Java you have to give the type and then its name.  In VB and Java you can use the '=' operator to place a value upon it.  The more complex side is to create a class with its own methods to use.  In Java it is common to use encapsulation.  Using class objects to perform specific tasks is a major component of the language.  Creating the objects of your own volition is the more hands-on approach.  Within the class you can identify the variables and methods it will use.  There is another aspect, the constructor, to deal with as well.  A constructor is called when the object is first instantiated using the new keyword.  Constructors can accept a value or not, but what happens in either instance has to be defined by the coder.  Once the object of the class is usable, you can then interact with its variables and methods as you wish.  Therefore, in the main coding section, you can use the object and refer to its contents instead of adding that section of code to the main portion.  Depending on the object, this can be easier for you than it would be without encapsulation.  For instance, you can code a method to perform a loop.  This loop could have been created in the main source code.  However, if you were to call it multiple times, it would be relatively simpler to write the steps and iterations of the loop once and call the method several times.
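A small Java sketch of that idea, with hypothetical names: the state is concealed behind methods, the constructor runs at instantiation, and the loop is written once inside a method that can be called as many times as needed.

```java
// Sketch of encapsulation: hidden state, a constructor, and a reusable looping method.
public class Counter {
    private int total;  // concealed state, only reachable through the methods below

    public Counter() {  // constructor, called when the object is created with new
        this.total = 0;
    }

    // the loop is defined once here, adding 1 + 2 + ... + n to the total...
    public void addUpTo(int n) {
        for (int i = 1; i <= n; i++) {
            total += i;
        }
    }

    public int getTotal() {
        return total;
    }

    public static void main(String[] args) {
        Counter c = new Counter();
        c.addUpTo(3);  // ...and can be called repeatedly from the main section
        c.addUpTo(3);
        System.out.println(c.getTotal());  // 6 + 6 = 12
    }
}
```

The main portion never repeats the loop itself; it only repeats the one-line method call.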
Abstraction, in my opinion, is the middle step between encapsulation and inheritance.  Not the middle in terms of the creation process, but it can use encapsulation to lead to inheritance, though it is its own concept.  The word abstraction means the quality of dealing with ideas rather than events.  The translation is near perfect, because for programmers it is creating the skeleton of a class.  The initial abstract object only incorporates the fundamental framework of what will be used.  Reduce the object to the bare minimum that is required for it to exist.  The class can be absorbed by another class that can then define what its variable contents mean and what its procedures do.  You create the class by stating only the types of the variables and the names of the methods.  Just the framework is necessary for initial processing, which makes abstraction unique.  The 'hello world' example is shapes.  You can model an abstract class or interface of shape with variables for height and length and then methods for draw, perimeter and area.  The idea is that all shapes have these properties but they will be defined differently.  The coder can design classes like circle, square, rectangle and triangle that share these same attributes though they are not handled identically.  Each can assign height and length, but how each is formed using draw and how the area of the shape is calculated have to be defined in each class, even while they use the same name of 'area'.  Area for a circle is pi multiplied by the radius squared, a square can be the length to the power of two, a rectangle can be length multiplied by height and a triangle can be half the length multiplied by the height.  This is an example of abstraction only because the theoretical parent class has nothing defined in it.  If we were to define any of the parameters in the parent class, it would have to be redefined by the child class, which is not optimal to the concept.  For shape there are no universal properties except the names of its contents, not the values.
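The shape example might look like the following in Java; the exact names and signatures are illustrative only.  The parent defines nothing but the method name, and each child supplies its own area formula.

```java
// Sketch of abstraction: Shape only names what every shape must provide.
abstract class Shape {
    abstract double area();  // no definition here, each child writes its own
}

class Circle extends Shape {
    private double radius;
    Circle(double radius) { this.radius = radius; }
    double area() { return Math.PI * radius * radius; }  // pi times radius squared
}

class Rectangle extends Shape {
    private double length, height;
    Rectangle(double length, double height) { this.length = length; this.height = height; }
    double area() { return length * height; }            // length times height
}

class Triangle extends Shape {
    private double base, height;
    Triangle(double base, double height) { this.base = base; this.height = height; }
    double area() { return 0.5 * base * height; }        // half the base times height
}

public class ShapeDemo {
    public static void main(String[] args) {
        // the same method name, area, behaves as each child defined it
        Shape s = new Rectangle(4, 2);
        System.out.println(s.area());  // 8.0
    }
}
```

Note that Shape itself can never be instantiated with new; only its children can, which is exactly the skeleton quality described above.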
Abstraction leads to the topic of inheritance.  Inheritance has parent and child classes.  The parent class will be defined, and then the child can inherit those capabilities and add to them if it needs to.  Inheritance lends itself more to the idea of a hierarchical structure than abstraction does.  Though the two are very similar, the idea behind inheritance is opening the door to use some of the predefined attributes of the parent class.  Usually in inheritance the entire object will not have to be redefined.  Users can do this in two ways.  One is to inherit the parent class and not redefine a specific detail.  This confirms that you wanted to use the parent definition for the child, because some of its traits can be carried over.  Another, in Java, is to call the parent in the child using the super constructor option.  In contrast to the shape example, I will discuss vehicle.  Now vehicle in these terms will refer to an automobile.  The vehicle will have four wheels, take gas, have glass windows and require one driver.  Those traits can all be passed to children coded as sedan, truck, van and sports car.  The children will hold information like top speed, number of passengers, number of doors and off-road capability, to name just a few attributes.  Now those can be variables of the parent that are not defined, or can belong only to each child.  That is the choice of the coder.  Either will suffice.  Those are examples of variables.  Some methods would be to drive forward, reverse, turn and park.  If the designer wanted to be specific, I suppose the size of the vehicle could be incorporated into park so each child would define it by the type of car it is.  Moving the vehicle can be defined in the parent and still called by the child.
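A rough Java sketch of the vehicle example, with hypothetical names, showing both ways at once: the child calls the parent through super, adds a trait of its own, and inherits a fully defined method it never redefines.

```java
// Sketch of inheritance: Vehicle defines the shared traits, children add their own.
class Vehicle {
    protected int wheels = 4;   // trait shared by every child
    protected int passengers;

    Vehicle(int passengers) {
        this.passengers = passengers;
    }

    // defined once in the parent, still callable from any child
    String driveForward() {
        return "Moving forward on " + wheels + " wheels";
    }
}

class Truck extends Vehicle {
    private boolean offRoad;    // trait that belongs only to this child

    Truck(int passengers, boolean offRoad) {
        super(passengers);      // call the parent constructor with super
        this.offRoad = offRoad;
    }
}

public class VehicleDemo {
    public static void main(String[] args) {
        Truck t = new Truck(2, true);
        System.out.println(t.driveForward());  // inherited, never redefined
    }
}
```

Truck never writes its own driveForward, yet the call works, which is the carried-over parent definition in action.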
Sharing certain properties transitions to the subject of polymorphism.  Polymorphism describes similar things existing at the same time.  That can mean variables, methods or even class objects.  Variables and methods can be governed by scope.  Scope entails the range in a program where something can be used and is identifiable.  For a variable to be out of range means that it is no longer in the range in which it was instantiated.  You can either recreate the variable for the current context or see if there was another error in the coding.  An example of this is to create a variable to use within a looping algorithm.  If a variable is created in the loop itself, it can be used in the loop.  Once the loop ends, the value that the variable held will no longer be in memory.  Then the variable can be recreated with the same name and used with a different value.  Methods can show polymorphism using the abstraction example.  Calling the draw method for each shape is a similar concept.  You can theoretically line up each shape using the draw method in succession.  Rectangle draw, circle draw, triangle draw and square draw will each respond with what it is designed to do.  Though they have the same name and can be contained in the shape class, each shape's draw will only respond to the class that requested it.  This can be confusing, but if the class object is used to identify the method then it becomes easier.  Classes can have more than one object as well.  Using the inheritance example, if I wanted to name vehicles as specific makes or models, that can be done using the same class.  Instead of naming my sedans one and two, I could name them honda_accord and toyota_corolla.  In most programming languages the underscore has to be used to connect the two words to avoid errors.  Both can be sedans, and both can have some differentiation when values are assigned to the variables.
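The draw idea can be sketched in Java like this; Drawable, Square and Star are names I am inventing for illustration.  The same method name is called in succession, and each class responds with its own version.

```java
// Sketch of polymorphism: one method name, draw, answered differently by each class.
abstract class Drawable {
    abstract String draw();
}

class Square extends Drawable {
    String draw() { return "drawing a square"; }
}

class Star extends Drawable {
    String draw() { return "drawing a star"; }
}

public class PolyDemo {
    public static void main(String[] args) {
        // the same call site, figure.draw(), dispatches to each object's own class
        Drawable[] figures = { new Square(), new Star() };
        for (Drawable figure : figures) {
            System.out.println(figure.draw());
        }
    }
}
```

The loop variable figure also illustrates scope: it exists only inside the for loop, and its name could be reused with a different value once the loop ends.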
Object-oriented programming has really helped designers advance coding with its definitive properties.  How encapsulation, abstraction, inheritance and polymorphism are used brings a new significance to programming.  Building new objects with classes that can be defined and processed as data types is remarkable.  It provides a first-hand level of control, with incomprehensible depth, over the ability to create whatever you feel possible in your imagination.  Speaking as of today, it may not be the revolutionary concept it was when first introduced.  Yet the concept of its intention has remained at the forefront of how programming is performed by professionals and taught by professors.  Since I am not as advanced as others in this genre, I cannot expound on areas of weakness or where improvements could be implemented.  Learning about these kinds of subjects can let you compile all the informational resources within your own comprehension, but execution requires a different level of mastery to be fully functional and operational.