
Wednesday, December 21, 2016

Testing Document for Software Engineering

Testing Overview
Testing was performed in the style of alpha white-box testing.  It was conducted from the perspective of the coder, with prior knowledge of the system.  Unit testing was done on individual web pages.  Integration testing covered the cross-referencing of web pages and the foreign-key attributes for records.  After each of these was conducted, a complete overview of the site as a whole was the last step, system testing.  Testing was based on the requirements the group presented in the design and analysis phases.  Scaffolds were created for the identified class objects except for personal information.  Scaffolds in Ruby provide a style sheet, a migration file for the database, a controller for page interaction, a model page for variable augmentation and four HTML pages.  The pages consist of new, show, edit and index, with a default form for the attributes from instantiation.  Actual testing modules and CoffeeScript pages were also created but were not implemented.  The site model was meant to fit the processes of the use cases and diagram charts from previous documents.

Unit Testing
Unit testing was performed for each page.  Whenever a page has an error, the user is sent to an error page with the details of what has happened.  This is excellent for hinting at how to fix it as well.  Every single webpage was tested to see if it was accessible via the browser.  Once retrieved, each of its links or attributes was tested.  Each new page of a scaffold was to insert the given information into the database as a record visible from the other three pages.  Show displays the data.  To elaborate, media files in the form of video or image must be present.  Social media links should go directly to the external page of the user’s input.  Edit allows you to change any attribute available from the form.  Index is a sort of home page for all records found in the database for a given class.  This sequence was repeated for each scaffold created.  Heading styles and the application’s navigation links should also be reliable as designed from page to page.  A sample image from the fileload page is attached below.

Integration Testing
The database was scripted to have the login object’s id be a foreign key of the social media, visual media and student activity objects.  This was to ensure that all information associated with a specific login id would appear on its page.  Once a login was created, the tester would go to the show page.  From show, I could then go to one of the other scaffolds to create a new object.  The login id would be a hidden field passed as an inherited session variable.  Returning to the login’s show page should now display the newly created video file, image file, Instagram link, Facebook link, Twitter link and activity update.  Additional route coding was added to the routes page to create the appropriate links for the necessary HTML actions.  Images are added below from the first test with Internet Explorer, where the visual media will play.


The second image is from Mozilla Firefox, where activities were added but video is not supported.

System Testing
A review of overall system performance was the final stage of system testing.  Unit testing and integration testing made this simple.  Aside from the review, the database was checked to see if information was entered and updated.  This was completed by inserting data via the website and using the SQLite terminal to check the tables.

Testing Results
In the early stages there were several errors, as misspellings would lead to object-not-found errors.  Working through those sorts of bugs allowed for the more technical aspects of the intended design.  There was a lot of trial and error involved, as changes were made followed by running the application to see the effects.  One regret was not getting the user authentication page working.  Authentication was supposed to check for the entered login name in the database.  If found, the next step was to take the encrypted password found, decrypt it and match it to the user-entered one.  One of the problems encountered was that creating such a record would be inefficient duplication in the database.  If the information was not entered into the database, it would not accept a direct match, as the primary key was not found using the implementation of a session variable.  If I were to use a non-session variable, the user would just be sent to an empty login screen.  Other than that, the final product performs very close to what was expected.  There were some flaws, such as videos only being viewable in mp4 format in the Internet Explorer browser.  But I was not able to modify that from a design standpoint.  It was maybe a conflict with the development environment or the source coding.  Also, there was not enough time for mobile conversion.  Overall, the final result is an application that meets many of the requested requirements.

High End Computers and Computational Science

The term “high end computer” has varying definitions.  Most of the creations which fall under that label sit between the early traditional or simple PC and very large supercomputers.  In fact, as personal computers became more popular and readily available, prices also went down.  Then generic or cheaper alternatives were designed as ones with increased performance and capacity began to hit the market.  The blending of the affordable PC with the superior supercomputer provides a machine where the possibilities for what it can do are nearly endless, and it can fit in one room of your home.  If a uniform high end computer can be designed, it will involve the computer being expensive and having elite processing capability.  Cost and performance go hand in hand, as the processors, memory, graphics cards and hard drives play directly into both.  To further elaborate, we are talking about processors with multiple cores and speeds measured in gigahertz, synchronous dynamic memory with many gigabytes dedicated to active services and applications, very high capacity graphics cards and solid state drives that can hold terabytes of data [1].  The trend in technology is that the new always seems to fade out the old.  Usually the intricate differences are so subtle that only experts in the field can explain what truly separates them.  The latest technological devices come at a higher cost than the previous iterations, and the enhanced performance is just part of the package.  These computers are commonly associated with research experiments, gaming and software development.  One more field which uses high end computers is Computational Science.  Computational Science is a budding field that draws a line so as not to be confused with Computer Science.
From an article posted in the SpringerPlus journal, the author defines it as “being the application and use of computational knowledge and skills (including programming) in order to aid in the understanding of and make new discoveries in traditional sciences, as opposed to the study of computers and computing per se” [2].  That classification describes how computers can be used in a variety of ways to assist with STEMM projects.  STEMM stands for science, technology, engineering, mathematics and medicine.  Aside from the somewhat self-referential inclusion of technology, computers are primarily used to process large quantities of information, break down and solve complex equations and store relevant records.  Where computer science focuses on the components of hardware and software, computational science focuses on their uses for other fields.  The interweaving of high end computing and computational science forms a very formidable tandem which has advanced computing in many areas.
One of the primary reasons for this collaboration, as previously stated, is research.  When research needs to produce results, it can be in the form of data visualization.  After gathering so much information, reporting has to be condensed to the facets of the study.  The numbers are broken down into divisions of categorical distinctions.  From the vast total, the research can generate representative parts of the sum.  Data visualization is the process of converting the information into models that can be interpreted by individuals who are not certified experts in the field.  A component of Computational Science can be the talent of reproducing graduate-level research so it can be received from a novice’s perspective.  Though the experts will certainly be able to understand it as well, it is intended for a larger group.  Data visualization has been known to help both very effectively.  The following excerpt from a high end computing publication gives some hints into the procedure: “The scientific method cycle of today’s terascale simulation applications consists of building experimental apparatus (terascale simulation model), collecting data by running experiments (terascale output of the simulation model), looking at the data (traditional visualization), and finally performing data analyses and analysis-driven visualization to discover, build, or test a new view of scientific reality (scientific discovery)” [3].  The computers assist in each integral step.  The design of the model and how it will operate is just as important as formally collecting the data.  Computable models are the models of prime interest in computational science.  A computable model is a computable function, as defined in computability theory, whose result can be compared to data from observations [4].  Model design can be about deciding the actual form of the input and output.
Input can be as simple as inserting numbers into preset forms or as multifaceted as providing hard files and having the data parsed.  Then telling the computer what to do with all that evidence is the computational side of the output.  For data visualization, the output will be a chart, graph or map.  Researchers can present the details of why the model’s data is isolated to create the image it does.  Some of the variations include bar charts, histograms, scatter plots, network models, streamgraphs, tree maps, Gantt charts and heat maps [5].  Would there be a loss in quality or relatability if a histogram was chosen over a Gantt chart, for example?  Discussing the details of what validates the choice of evocative data sets can be a very interesting niche for this study.  All of these, however, use the power of high end computers to create high resolution images with rich colors and niceties as the result for computational science.  These images easily translate pages and pages of raw data into a much more digestible format.  Research that may have taken years and thousands of contributors can be reduced to a two dimensional appearance viewable from one page.
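The cited definition of a computable model can be made concrete with a short sketch in Scala; the linear model, data points and tolerance here are invented purely for illustration.

```scala
// Illustrative sketch of a "computable model": a computable function whose
// result can be compared to data from observations (per the definition above).
object ModelCheck {
  // Hypothetical model: predicts an output value from an input parameter.
  def model(t: Double): Double = 2.0 * t

  // Compare model output against (input, observed) pairs within a tolerance.
  def fitsObservations(obs: Seq[(Double, Double)], tol: Double): Boolean =
    obs.forall { case (t, observed) => math.abs(model(t) - observed) <= tol }
}
```

Visualization would then be a separate step layered on top: plotting the observed points next to the model’s predictions.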
Three more examples of applications of this combination are parallel computing, grid computing and distributed computing.  From a computational science perspective, grid computing generates individual reports from locations as the information is fed into a primary resource.  Distributed computing would rather produce a comprehensive report using multiple network nodes to collect data.  Lastly, parallel computing uses one main source to assist all the other nodes of a network.  Grid computing and distributed computing function in a very similar manner, as the network architecture is almost identical, but the end result as well as the root cause can be different.  A perfect illustration of this concept is the BOINC program at the University of California, Berkeley.  BOINC is an acronym for Berkeley Open Infrastructure for Network Computing [6].  Its initial release was in the year 2002.  Since then it has cycled through many participants who readily volunteer their computers and services for the research cause of their choice.  And there are many to choose from.  A recent check of the website displayed over 264,000 active volunteers using over 947,000 machines spread across almost 40 projects.  Projects vary in topic from interstellar research and identifying alien lifeforms to what could be the next steps in advanced medicine.  Each one has volunteers dedicate time and resources to what they have interest in.  Each person can go through the process of completing an agreement form and downloading software to become a part of it.  The projects use several high end computers, and maybe not-so-high-end computers, across the network to receive data at an alarmingly fast rate.  Systems of this type are measured in floating point operations per second, or FLOPS.  That unit of measure would most likely fall under the category of a performance metric [7].  High end computing can be measured by the application, the machine or the combined integration configuring performance metrics.
Instruction sets of the application processed by the available sockets and cores of the system during a given clock cycle can lead to a number for how many floating-point operations are conducted.  To be precise, the FLOPS amounts calculated by the BOINC structure are converted to petaFLOPS.  The peta- prefix denotes a factor of 10^15.  That quantity is possible from the immense shared space and very little idle time or mechanical malfunction.  That is an enormous number.  To further explain, imagine the speed you would have to move at to do a task a million times in one billionth of a second.  When the data collection is time sensitive, errors can arise from user authentication and system authorization.  People gaining access to the network, and the network having access to the computer, can be viewed as human error if not completed correctly.  Another area to ensure is security, to prevent interception or modification of information as it is transmitted.  Data needs to be protected as it is passed from node to node.  In many of the projects the retrieved data is geo-specific; any misrepresentation or alteration of the records can seriously corrupt the reporting of the final statistics.  Encryption and decryption can play a major role in the security of the project.  There are countless methods to do this, but each will most likely ensure a way for the message to be encoded leaving the home location and decoded only once it reaches its intended destination.  Accuracy is critical to enterprise-level operations in this field.
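As a rough sketch of that arithmetic, the peak-FLOPS estimate described above (sockets × cores per socket × clock rate × floating-point operations per cycle) and the peta- conversion can be written out in a few lines of Scala; the figures used below are invented for illustration and are not BOINC’s actual numbers.

```scala
// Illustrative only: a theoretical peak-FLOPS estimate from the factors
// named in the text, plus the conversion to petaFLOPS (peta- = 10^15).
object FlopsEstimate {
  def peakFlops(sockets: Int, coresPerSocket: Int,
                clockHz: Double, flopsPerCycle: Int): Double =
    sockets * coresPerSocket * clockHz * flopsPerCycle

  def toPetaFlops(flops: Double): Double = flops / 1e15
}
```

For example, a hypothetical 2-socket, 16-core machine at 3.0 GHz doing 16 floating-point operations per cycle peaks around 1.5 × 10^12 FLOPS, so roughly 0.0015 petaFLOPS; it takes many hundreds of such volunteer machines to reach the petaFLOPS scale.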
Computational Science and High End Computing are the proverbial match made in heaven.  Together there is a give-and-give sort of relationship where the prospects of each are enhanced by the other.  They are not completely inseparable, however.  Gaming is a major industry for high end computing.  The frame rates of today’s video games are only possible with certain machines and graphics cards.  The minimum requirements for games and applications are sometimes spoken about beforehand.  Computations can be done by human beings given a sufficient amount of time.  Computers have not been around nearly as long as science and math.  People in these fields have performed in them for centuries.  The majority of advancements that reach the mainstream begin from a human-proposed thesis and are assisted with, not dependent upon, technology.  But the merger is what allows for maximizing effectiveness and time spent.  The combination has expedited and enhanced research in several fields.  It has also been able to broaden the possibilities of what can be done.  Results can be recreated into graphical representations to discuss.  The data conversion provides a visual to better comprehend these often large sets of raw numbers.  In conclusion, as a student of Computer Science I think that one of the most astonishing feats may just be the idea of the pair growing into its own genre and not within the borders of the CompSci discipline.  I am equally impressed by the component materials needed to build and modify a high end computer as by its usage for mathematical and scientific applications.  Hopefully both will continue to flourish in the future with their ingenuity and popularity.  And the next evolution may be just around the corner.

References
[1] Origin PC Corporation. https://www.originpc.com/gaming/desktops/genesis/#spec2
[2] McDonagh, J., Barker, D. and Alderson, R. G. Bringing Computational Science To The Public. SpringerPlus. 2016.
[3] Ostrouchov, G. and Samatova, N. F. High End Computing for Full-Context Analysis and Visualization: when the experiment is done. 2013
[4] Hinsen, K. Computational science: shifting the focus from tools to models.  F1000Research.  2014.
[5] Data Visualization. https://en.wikipedia.org/wiki/Data_visualization.
[6] http://boinc.berkeley.edu/
[7] Vetter, J., Windus, T., and Gorda, B.  Performance Metrics for High End Computing.  2003.

Report on Scala

Scala is a programming language created by Martin Odersky and others.  It is intended to be an elegant blend of object-oriented design and functional programming.  There is a belief within its ranks that every function is a value and every value is an object.  Development began on the language in 2001, followed by the initial release in 2003.  After an attempt to improve Java, the project spun off into its own language formulated specifically for component-style software engineering.  The data types of the language are common.  Numbers can be in the form of doubles, floats, longs, ints, shorts and bytes.  Strings, long-form sequences of alphanumeric or Unicode characters, are from the Java library.  There are also individual characters as chars and true/false values as Booleans.  There is some uniqueness, as units, iterables, maps, options, sets and lists are added to the mix of instantiable data types.  Even “empty” values are possible with the Nothing and Null types.  Scala has five primary keywords for creation, as class, object, def, val and var are needed to identify user definitions.  Classes are the design for what an object can be, while objects are just a single instance.  Another keyword, new, is used to convert a class to a created object.  The main method to run an application is customarily contained in a user-defined object.  The def keyword is used to create functions.  The keyword is followed by the name, the parameters in parentheses, a colon, the return type, an assignment operator and the set of instructions contained in brackets.  Val signifies a placeholder where the assigned value will not be changed, while var can be modified later in usage.  The syntax calls for either the val or var keyword followed by the name, then a colon, the datatype and finally what is being assigned to it.  Scala also has user-defined types.  The keyword type is placed before the name, and then you assign a predefined type with the assignment operator.
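The keyword descriptions above can be grounded in a minimal sketch; all of the names here (Greeter, Demo, Label) are invented for illustration.

```scala
// class: the design for what an object can be.
class Greeter(name: String) {
  // def: name, parameters in parentheses, colon, return type, '=' and the body.
  def greet(times: Int): String =
    ("Hello, " + name + "! ") * times
}

// object: a single instance; by convention it holds the main method.
object Demo {
  type Label = String          // user-defined type alias via the type keyword
  val fixed: Label = "immutable"  // val: the assigned value cannot change
  var counter: Int = 0            // var: can be modified later in usage

  def main(args: Array[String]): Unit = {
    val g = new Greeter("Scala")  // new converts the class to a created object
    counter += 1
    println(g.greet(counter))
  }
}
```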
As file organization goes, Scala is very similar to Java in the matter of naming conventions.  Also like Java, Scala uses the keywords package and import to define project scope and add external files, respectively.  Scala features two forms of generic abstraction.  One is the traditional abstract class, where you provide a simple framework to be used later.  Very analogous to abstract classes is the concept of traits.  Traits also allow variables and methods that are inheritable by another class.  The inheritance occurs with the keyword extends.  The capability for concurrent processing is very heavily associated with the ‘java.util.concurrent’ package.  There are two points to this topic.  The first is having the two library interfaces Callable, which returns a value, and Runnable, which does not.  Then you will have to look at the various forms of threading possible with synchronous and asynchronous tasks.  Where Scala shows promise is in how vast and dynamic it can be.  The possibility to implement nested functions is another positive addition.  For me, a drawback is how closely related it is to Java, including running on the Java virtual machine.
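A small sketch of traits and the extends keyword; the Shape and Square names are invented for illustration.

```scala
// A trait supplies inheritable members, much like an abstract class.
trait Shape {
  val sides: Int                 // abstract, inheritable variable
  def describe: String =         // inheritable method with a default body
    "a shape with " + sides + " sides"
}

// Inheritance occurs with the keyword extends.
class Square extends Shape {
  val sides: Int = 4             // supplies the abstract member
}
```

Unlike abstract classes, a class may mix in several traits, which is part of what makes them suited to component-style software engineering.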
The difference between Scala and C would be akin to comparing Java and C.  Start with the time period to get a better understanding.  C was created in 1972 and Scala in 2001.  C has been the basis for many programming language concepts since its inception.  Scala is a fairly new language and is heavily dependent on Java.  Much of the grunt work that developers had to do in C has been made much easier by built-in libraries in Scala.  The major differences you could point to would be the same as in comparing other languages.  All languages will have a particular syntax for performing common tasks.  The level of complexity depends on the features included in the language, but again they may be more time-dependent, as concepts were created by pioneers and then made simpler for the users to come later down the line.  Completing project one in Scala would be helped by the built-in array sorting feature, but much of the rest would be similar.
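A minimal sketch of that built-in sorting, assuming a “project one”-style task of sorting an array of numbers; the names are invented for illustration, and the nested helper also demonstrates the nested functions mentioned earlier.

```scala
object SortDemo {
  def sortScores(scores: Array[Int]): Array[Int] = {
    // Nested function: a helper visible only inside sortScores.
    def isSorted(a: Array[Int]): Boolean =
      a.zip(a.drop(1)).forall { case (x, y) => x <= y }

    // Library sorting; in C you would hand-write a sort or call qsort
    // with a comparator function.
    val result = scores.sorted
    assert(isSorted(result))
    result
  }
}
```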

Sunday, December 18, 2016

Right to Privacy

I value my privacy, but I have learned at this point of my life that revealing information has its price.  Privacy is more about simple respect than having something to hide.  Sharing creates a glass-door effect that alerts others to your presence willingly.  Privacy should not be a concept to be violated unlawfully.  It is a right afforded to every United States citizen implicitly by the Constitution and specifically by the Privacy Act.  Sadly, the same cannot be said of every citizen of the world.  This topic can be sequestered into a human rights debate when honestly assessing what the government of a country should have access to.  But I will try not to veer off into the deep end too much.  Personally, for me it is my creative space.  Being left alone to think allows me to focus and put forth my best efforts.  In a group or public setting, I can function in a read-and-react situation, but many times my initial response will not address the underlying crisis or the overriding subsequent steps that surely will follow.  I need significant time with myself to consider a solution to untangle the utmost conceivable possibilities.  So for me, solitude is very important.  Isolation can be a form of self-seclusion for a positive impact.  Loneliness is the negative connotation of privacy that demonstrates a quarantining effect.  However, we also live in a time where people easily waive that right with the advent of social media.  You can find everything from nude pictures to intimate stories on the world wide web.  I guess whatever you like you can find in whatever way you see fit.  Sometimes you can even find out what you do not like.  Privacy is such an issue that individuals can comprehend people getting angry from reading what another has written.  People can go to someone’s webpage and tell them they are offended by what was seen.  Freedom of speech is only preferred when someone agrees your words are feasible and worth their acceptance.
It’s not about just being yourself anymore.  It is really about appealing to the most people possible.  Like-mindedness can become group thinking given enough time and space. 
I cannot envision the situation being reversed, where forcibly imposing yourself on someone can be beneficial to all parties.  Physical attacks or assaults are common criminal offenses in many regions of the world.  Invasion of privacy in the form of computer hacking and videotaping is real as well.  Cases have been presented in court where information or images were released without the aggrieved party’s appropriate consent.  There are other forms with frequent occurrences, but as stated earlier I will try not to make that the center of the discussion.  I suppose it might be a crucial element from a law enforcement perspective.  When a threat is identified, it will have to be intercepted and nullified as soon as possible.  Warrants along with probable cause spark the negative sort of interest at which your personal space can, and most likely will, be invaded.  So abiding by the law will procure each of us the safety and security that we enjoy.  The moral of the story is that breaking the written law can eventually have your rights revoked.

Saturday, December 10, 2016

Supplements For Health

My experience with herbal medicine is a very positive one.  Now admittedly, more often than not, doctor-prescribed medicine is the best way to resolve health issues.  This is not written to condemn modern pharmaceutical drugs.  It is meant to provide insight into alternatives for less severe ailments that you might experience from day to day.  Shopping online from sites like Amazon, or in person at stores like Vitamin Shoppe and Wal-Mart, can deliver very affordable supplements that can improve your overall health.
Let’s begin with some of the basics.  Many people cope with various forms of pain, everyone from your strongest professional athlete to the maybe not-so-in-shape person with a desk job where they perform at a computer for eight hours a day.  Mild pain relievers like aspirin or Tylenol are readily available all over the world.  Extended usage in terms of time, or overusage in terms of dosage, can lead to more health concerns.  An alternative is ginger.  Ginger is an excellent anti-inflammatory and pain reliever when taken in the proper dosage.  It also has anti-microbial properties that can help your immune system fight bacterial, viral and fungal infections.  For individuals like me, it provides immense assistance, as my system is very sensitive to aspirin.  When taken in low doses I can experience gastrointestinal problems, and higher doses will worsen the conditions which other ailments will show.
Additional supplements like milk thistle, garlic and cinnamon also provide exceptional health benefits.  Milk thistle focuses on the healing and regeneration of the liver.  The liver is essential to the body for fluid circulation, protein synthesis and, maybe most importantly, detoxifying your body.  Sometimes when feeling ill, taking milk thistle gives that magical cooling feeling that indicates something is healing.  A healthy liver is mandatory for your anatomy and spirit.  Garlic is similar to ginger in terms of being an anti-microbial agent.  My interaction leads me to believe garlic is stronger.  Also, many garlic supplements are in the form of concentrated oil, which moisturizes the heart and patches up your arteries.  It helps with acne problems and cholesterol as well.  Cholesterol has its good and bad forms.  Garlic can lower the bad form, LDL, by chipping away at it as it passes through your system.  It will also increase the good form, HDL, as previously stated, improving the condition of damaged arteries.  Cinnamon also has anti-microbial properties.  There are forms of the supplement that are combined with chromium III picolinate.  The combination is very supportive for maintaining blood sugar for diabetics and for weight loss.  Counterbalancing the sugar in your body is a main function of cinnamon.  Reducing it can increase your metabolism.  Once your metabolism is sped up, your body can work at peak performance.  The cliché of firing on all cylinders fits in here.
Three other supplements I would like to add to this entry are soy lecithin, melatonin and acai berry.  Soy lecithin is a form of protein as a supplement.  With its basis in soy products, it can be a primary way for vegans and vegetarians to get a quality source of protein in their diet.  Melatonin is a sleeping aid for whenever you start experiencing restless nights.  Taking it can enable a deeper and longer night’s rest.  Acai berry assists in removing stubborn waste from your body.  Additional water and sodium in your body cause bloating and an overweight appearance.  Acai berry really helps in “flushing” those substances and other toxins out of your system when taken consistently.
I could write about a few more substances, but I do not want to be confused with an expert.  I have crossed paths with supplements like goldenseal, pau d’arco, and collagen, mostly for infections and skin conditions.  But I will not elaborate, because maybe a physician or dermatologist can speak about something which can be more effective.  There really is not a middle ground when it comes to health.  Some substances that do not cure will strengthen the disease you’re battling.  What I discussed was helpful but is not a substitute for your doctor’s advice.  Supplements will provide significant steps towards progress.  And that alone can be worth taking into consideration.