CO4754 User-Centred System Design & Evaluation
Assignment 2 – Evaluation

Website Usability Evaluation for a Large Further Education College

ABSTRACT

The further education sector is a highly competitive environment in which organizations operate on very low margins; recruitment of students is vital not only to meet organizational goals, but literally for the survival of the organization. The organization's website provides a cost-effective platform to attract and recruit learners, including assessment of their needs and of the organization's ability to use public funding to cover course costs. The website is therefore a critical part of the business: its usability directly affects recruitment numbers and the success of the business. For larger organizations delivering learning across all subject areas, at all levels from pre-GCSE to master's degree, and across a number of locations, this presents a challenge for the web platform. This paper assesses the usability of the recently deployed Cornwall College Group website and recommends areas for improvement.

Categories and Subject Descriptors

H.5.2 [Information Interfaces and Presentation]: User Interfaces – Evaluation/methodology, Interaction styles, Screen design, User-centred design.

General Terms

Design, Human Factors, Measurement.

Keywords

interface evaluation; think-aloud; usability evaluation; further education

1. INTRODUCTION

The website of a Further Education College provides the first experience of the organisation for many learners. The website has many functions, such as allowing online applications and enrolments, course search and comparison, as well as providing organisational information. The primary purpose of the website is sales; in this context success means getting learners to make contact with the College so that the various teams can engage with each learner and find an appropriate course for them. It is vital that the website does not hinder this process, and that it goes beyond meeting the functional requirements by meeting user needs simply. Given the broad range of students in terms of age, experience and cognitive ability, and the complexities of the UK Further Education system, this presents a real challenge.

This paper reports on a usability study of the primary website of [removed]. In particular, it reviews the literature concerning website usability assessment, describes the evaluation methods used, and summarises the results, including recommendations for improvements.

2. BACKGROUND

[Removed] College Group is one of the largest Further Education Colleges in the UK, delivering learning to over 20,000 learners each year. The Group encompasses three main brands [removed for anonymisation].

The College is focused on delivering exceptional customer service; part of this strategy is the provision of self-service online tools enabling students to manage their own details. The website [removed] forms part of the self-service approach as the starting point for the learner's journey. Following completion of a re-platforming project 12 months ago, recent internal feedback has indicated that there may be usability issues causing potential learners to go to other providers, losing the College both income and credibility.

The College Information Systems team have been actively monitoring website usage since it was launched, primarily using Google Analytics and log file analysis.
This has resulted in quick changes where it became obvious that certain pages, such as the login page, were preventing learners from submitting applications. Since the launch of the website, the College Information Systems team have developed a better understanding of the website users and their goals through the adoption of goal-based design [3]; it is therefore appropriate to review the website in the context of this new knowledge.

3. LITERATURE STUDY

3.1 Usability of Websites

Defining usability is not simple. There is a standard definition provided within ISO 9241-11 [12]:

Usability is the effectiveness, efficiency and satisfaction with which specified users achieve specified goals in particular environments.

This definition is useful in that it refers to specific users and goals, tying the usability definition together with a goal-based design approach as described by Cooper [3]. However, the authors of much of the HCI literature have differing understandings of what this means in a web context. Benbunan-Fich identified that, in a web context, usability can be defined as "how well and how easily a user, without formal training, can interact with an information system of a website" [1]. This definition highlights an important aspect of websites intended for public consumption: usability is considered for users without formal training, so the interface should be obvious and intuitive, leading the user through the site. Barnard et al. suggested that a "true usable system must be compatible not only with the characteristics of human perception and action, but, most critically, with users' cognitive skills in communication, understanding, memory and problem solving" [2]. The highlighting of different user abilities has particular relevance to a further education college website, since the site will be accessed by users with widely varying cognitive skills, ranging from pre-GCSE to master's degree students. Returning to the ISO definition, if the website is to be usable by both a pre-GCSE learner and a master's degree student, it must satisfy them both when enabling them to achieve their goals.

Kincl and Strach [14] focus on the ISO definition, pointing out that effectiveness and efficiency are necessary yet insufficient conditions for attaining user satisfaction. Lindgaard [15] states that it is not the content of the website but the emotional reaction to its visual appeal that drives satisfaction. Tractinsky and Zmiri [24] point to empirical evidence that users can be more satisfied with a visually appealing website than with a website that is usable but less visually attractive. Kincl and Strach [14] apply Herzberg's motivation theory [9] to explain these findings, with the satisfaction rating of the website being the sum of satisfaction variables, grouped following Herzberg's theory into dissatisfiers, satisfiers and hybrids.
This helps in understanding that there are elements of the interface that will dissatisfy the user if absent or inadequate, those that satisfy the user, and those that are neither negative nor positive but must nonetheless be present.

Garrido et al. [7] defined website usability in terms of the factors accessibility, navigability, effectiveness, credibility, understandability, customisation and learnability. The definitions of the factors are very specific and therefore particularly useful within a software engineering team that does not have the benefit of a usability expert.

3.2 Usability Evaluation of Websites

The effectiveness and efficiency components of usability, as defined in the ISO definition, lend themselves to being measurable and can therefore be assessed in terms of utility and ease of use [12]. Rosson and Carroll provide evidence that testing of websites needs to be based on typical scenarios and tasks [22].

There are many sources with lists of usability evaluation methods (UEMs); however, Hom's The Usability Methods Toolbox [10] provides a comprehensive review of methods, with clear explanations of how to conduct each method and when its use is appropriate. An omission is the use of software or automated evaluation, possibly due to the age of the toolbox.

Hasan et al. [8] categorise UEMs in terms of how the usability problems are identified: for example by users, evaluators or software. User-based UEMs usually involve users being observed undertaking pre-defined tasks. Evaluator-based UEMs involve a number of expert evaluators assessing the user interface to judge whether it conforms to a set of usability principles, known as "heuristics" [17]. Software-based UEMs use software tools, such as Google Analytics, to identify usability problems; this involves collecting, measuring, monitoring, analysing and reporting web usage data to understand visitors' experiences. Their study demonstrated that such tools have limitations in the evaluation of website usability, relating to the fact that the web metrics indicate only a potential usability issue. They could not provide in-depth detail about specific problems that might be present on a page; this requires user testing and/or heuristic methods.

Nielsen and Molich [19] focused on evaluator-based UEMs and concluded that the ideal number of evaluators required to identify 80% of usability problems was 3-5; however, further studies by Hwang and Salvendy [11] demonstrated that where evaluators have only basic training in heuristics this number should be 10±2. Hwang and Salvendy's study was felt to be particularly relevant here, as the evaluators being used would be new to the heuristic evaluation method.

Ivory and Hearst [13] provide a detailed analysis of usability methods, highlighting that findings can vary widely when different evaluators study the same interface, even when using the same technique. The wide variation is a concern, with less than 1% overlap, which they postulate implies a lack of systematicity or predictability in usability assessment. For these reasons they concur with Dix et al. [4] and recommend that several different evaluation techniques be applied. They also highlight the opportunities that the web platform provides, especially automated usability testing; automating the testing reduces testing costs and provides quicker feedback on alternative designs.

Tan et al. [23] confirm that effective evaluation of usability requires the use of multiple methods. Their study focuses on a comparison of findings from heuristic evaluation versus user testing.
Both methods were found to be equally effective in identifying usability problems across five categories (navigation; information content; layout, organisation and structure; usability and availability of tools; and common look and feel), but user testing did not identify problems relating to two further categories (compatibility, and security and privacy issues). The recommendation is that the methods be applied at the appropriate time in the web development lifecycle, with heuristic evaluation more effective in the early stages and user testing in the later stages.

Prom [21] investigated the use of Google Analytics to evaluate and improve the design and content of websites. The study used standard reports from Google Analytics (i.e. funnel navigation) without deriving specific metrics. Analysis of Google Analytics data enabled problems to be identified quickly and helped determine whether a site provides the necessary information to its visitors. This study has particular relevance, as Cornwall College has collected usage data using Google Analytics for a number of years; the study suggests that analysis of the data will help identify usability issues.

3.3 Usability Evaluation Feedback

Since the aim of this research is to provide feedback on identified usability issues, it is appropriate that methods for providing that feedback are considered. Nørgaard and Hornbæk's [20] study into the effectiveness of usability feedback explores how developers respond to a range of feedback formats. Their recommendation is that feedback be given through multimedia presentation, the screen dump format (also known as screen snapshots [10]), or redesign proposals. This study will provide feedback as redesign proposals or screen snapshots of problems.

4. RESEARCH METHODOLOGY

4.1 Selection of UEMs

Three UEMs were selected: heuristic evaluation, Google Analytics and Crazy Egg heatmaps. This was based on evidence that these methods are complementary, in that they are able to identify usability problems from different perspectives ([8], [18], [23]). The missing UEM is user testing, which is a potentially significant omission, but it could not be completed within the timescales of this research. Google Analytics was used since it has been in use at the College for a number of years, a good amount of data exists, and the application has a wide range of features and benefits (e.g. a usable and simple interface). Crazy Egg is similar to Google Analytics, but provides more insight into exactly where users are clicking on a page and which parts of the page are being viewed.

4.2 Research Design

The design of this research is based upon:

1. Expert evaluation using Nielsen's 10 heuristics [17]. Each evaluator had 1 hour to complete 3 tasks as described in the recently developed persona scenarios [3]:
   a. Using the persona of John the School Leaver, submit an application for a public services course starting in September.
   b. Using the persona of John the School Leaver, research what grant funding is available.
   c. Using the persona of Alice the Recreational Learner, enquire about the cost of an evening intermediate Italian course at Camborne.
2. In addition to the classic method of heuristic evaluation, problems were not only recorded; screen snapshots were also taken to help the development team replicate, identify and resolve any problems.
3. A procedural model drawn from the work of Masemola & de Villiers [16]:
   a. Set up objectives in line with the research questions.
   b. Determine the aspects to be measured and their metrics.
   c. Formulate documents: initial test plan, task list, information document for participants, checklist for the administrator; and determine a means of investigating satisfaction.
   d. Acquire participants.
   e. Conduct the usability test.
   f. Determine means of analysis and presentation that address the unique, as well as the usual, aspects.
   g. Draw conclusions and make proposals for the way forward.
4. Software evaluation using Google Analytics followed the approach suggested by Hasan et al. [8] but was limited to user flow analysis and user drop-offs.
5. Software evaluation using Crazy Egg was completed following the instructions on the tool's website. The particular aspects studied were page heatmaps indicating user clicks and the areas of pages that were viewed.

5. DATA COLLECTION AND ANALYSIS

5.1 Evaluators

Expert evaluators were enlisted from the Information Systems team and given 30 minutes of training on heuristic evaluation before beginning. All the evaluators have significant experience in developing web-based user interfaces. Although heuristic evaluation is usually performed by usability experts, the evaluators had a good fundamental understanding of web usability, and their involvement in the study would provide a valuable contribution to their personal development [5]. Hwang and Salvendy [11] predict that where evaluators have had only basic training, the number of evaluators required to identify 80% of usability problems will be 10±2 rather than the 3-5 identified by Nielsen and Molich [19]; for this research only seven evaluators were available.

5.2 Ethical Considerations

Although the College website was implemented by a third party, members of the Information Systems team had been involved in elements of the design and integration. One member of the team had a lead role in the design and was therefore given a special briefing before the research was conducted, to explain the objectives and approach and to make clear that the exercise was not an evaluation of their performance or skills. Whilst this team member was present during the evaluation, they did not participate, instead acting as an observer and assistant.

Neither software evaluation method collected any data that would enable the identification of individuals; therefore there were no personal data issues with these methods. However, the privacy and cookie policy of the website was checked to ensure that users were aware that tracking tools were in use for the purposes of improving user experience.

5.3 Study

5.3.1 Expert Evaluation

The evaluators were given the three tasks to achieve, together with a worksheet to record any violations of the heuristics. The study was completed using the live website; the evaluators were asked to follow each task to completion, except to stop before submitting an actual application or enrolment, to prevent the study affecting other staff. The study took 1 hour 15 minutes for all evaluators to complete. On completion of the tasks, the evaluators merged their findings, assigning a priority to each violation identified.

5.3.2 Software Evaluation

Google Analytics was already in use, so no configuration was required. The analysis used data for the period 11 December 2013 – 10 January 2014.

Crazy Egg was not installed prior to the research. Setup was similar to Google Analytics and required the insertion of a small JavaScript snippet into the website template; the general shape of such an installation is sketched below.
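The exact Crazy Egg snippet is account-specific; the following minimal sketch only illustrates the general pattern of injecting such a tracking script into every page, and the account path ("0000/0000") is a placeholder rather than the College's real identifier.

```typescript
// Minimal sketch of injecting a Crazy Egg-style tracking script into a page
// template. The account path is a placeholder, not a real account identifier.
function installCrazyEgg(accountPath: string): void {
  const script = document.createElement("script");
  script.src = `//script.crazyegg.com/pages/scripts/${accountPath}.js`;
  script.async = true; // load without blocking page rendering
  document.head.appendChild(script);
}

installCrazyEgg("0000/0000");
```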
The analysis used data collected over a 10-day period at the beginning of January 2014. User click tracking was established on the landing page and on all pages at the level below the landing page. Additionally, tracking was enabled on the course search and details pages, so that the user's expected journey through the website was tracked.

5.4 Evaluation

5.4.1 Expert Evaluation

Expert evaluation identified 29 usability issues with the website. Consistent with the studied literature, there was little overlap between the issues identified by the different evaluators. Table 1 shows the number of detections of each issue and its assessed impact.

Table 1. Issue detection frequency and impact.
| No. of evaluators reporting issue | Low impact | Medium impact | High impact |
|---|---|---|---|
| 1 | 8 | 13 | 6 |
| 2 | 1 | 4 | 1 |
The issues detected spanned the range of heuristics; Table 2 summarises the number of issues violating each heuristic, by the impact of the violation.

Table 2. Number of heuristic violations by impact.
| Heuristic | Low impact | Medium impact | High impact |
|---|---|---|---|
| Visibility of system status | 2 | | |
| Maximise match between the system and the real world | 2 | 1 | 1 |
| Consistency and standards | 1 | 4 | |
| Error prevention | 1 | | |
| Recognition rather than recall | 1 | 1 | 2 |
| Flexibility and efficiency of use | 1 | 2 | |
| Aesthetics and minimalist design | 1 | 4 | 1 |
| Help users recognise, diagnose and recover from errors | 1 | 2 | |
| Help and documentation | 1 | | |
The major usability problems, and possible or implemented solutions, are:

- Users receive error message pages showing codes such as 400 or 500, with no indication of what they should do. The codes are of no use to a user; the occurrence of the errors needs to be investigated and removed, but when an error does occur the user should get a simple message asking them to try again or to contact the College by another method, such as live chat, telephone or email.
- A Christmas closure banner was obscuring a third of the screen during the testing. This was caused by unintended use of a feature designed for significant abnormal events, such as closure due to flooding or snow. The banner concept is useful but needs to be developed further so that once a user has read the message they can dismiss it for the duration of the session (see the sketch after this list).
- When a user has created a basket with a combination of questions, applications and enrolments on courses, the question screen gets stuck and prevents the user from continuing. This is a bug within the website that needs to be fixed as a matter of urgency.
- When a user asks a question about a course, the validation does not provide feedback if they have not completed all the required fields; it simply prevents the user from proceeding. The feedback needs to be fixed.
- The course search has an auto-populating predictive list that appears as the user is typing. The list is not wide enough to show the full course title, so the important details are hidden. Consideration needs to be given to resolving this, either by increasing the width of the list or perhaps by abbreviating obvious words.

A full list of findings is at Annex A.
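The banner-dismissal behaviour recommended above could work along the following lines. This is a sketch only: the element IDs are hypothetical, and `sessionStorage` is used because it is cleared at the end of the browser session, matching the requirement that a dismissal lasts only for that session.

```typescript
// Sketch: allow a user to dismiss the closure banner for the rest of their
// session. Element IDs are hypothetical, not taken from the actual site.
const BANNER_KEY = "closureBannerDismissed";

function initClosureBanner(): void {
  const banner = document.getElementById("closure-banner");
  const dismiss = document.getElementById("closure-banner-dismiss");
  if (!banner || !dismiss) return;

  // Hide immediately if the banner was already dismissed in this session.
  if (sessionStorage.getItem(BANNER_KEY) === "true") {
    banner.style.display = "none";
    return;
  }

  dismiss.addEventListener("click", () => {
    banner.style.display = "none";
    sessionStorage.setItem(BANNER_KEY, "true"); // cleared when the session ends
  });
}

initClosureBanner();
```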

5.4.2 Software Evaluation

5.4.2.1 Google Analytics

Analysis of Visitor Flow revealed that the most popular route through the website is:

1. Landing Page – /
2. Search Results – /search/courses
3. Course Details – /courses/

5% of users (363) iterated at least 10 times between Course Details and the Search Results before leaving the site, with 40% of users leaving after the first iteration.
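Google Analytics' standard visitor-flow reports do not expose per-session page sequences directly, so a derived count of this kind would have to come from exported or raw log data. The sketch below shows how the iteration figure above could be computed, assuming one session's ordered page paths as input; the path prefixes match the flow listed above, but the export format is an assumption.

```typescript
// Sketch: count Search Results -> Course Details round trips within one
// session, given the session's ordered page paths (assumed exported from logs).
function countSearchIterations(pagePaths: string[]): number {
  let iterations = 0;
  let onResults = false;
  for (const path of pagePaths) {
    if (path.startsWith("/search/courses")) {
      onResults = true;
    } else if (onResults && path.startsWith("/courses/")) {
      iterations += 1; // completed one results -> details hop
      onResults = false;
    }
  }
  return iterations;
}

// Example session: results -> details -> results -> details gives 2 iterations.
console.log(countSearchIterations([
  "/", "/search/courses", "/courses/public-services",
  "/search/courses", "/courses/italian-intermediate",
]));
```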
The next most popular flow is to the Adult Hub page, followed by the search results to course details flow.

[Figure 1 – Landing page screenshot showing the two search boxes]

The third most popular flow is of interest:

1. Landing Page – /
2. Site Search – /search/site
3. Course Search – /search/courses
4. Course Details – /courses/
Users appear to use the site search and then the course search, having not got the results they were expecting. Figure 1 shows the layout of the search boxes on the landing page.

The major usability problems and possible solutions are:

- The layout should be improved by hiding the site search input box and displaying a site search link that, when clicked, expands into a search box, so that the user is not presented with two areas in which to input data (a sketch of this pattern follows this list). This effect can be seen clearly on the Google home page with the recent change to hide Google Apps, as shown in Figure 2.

[Figure 2 – Google Apps hidden unless clicked]

- The search is really the core focus of the page, so the cursor could be automatically focused in the course search input box. This would also enable regular users to start typing without having to click on it [7], as well as drawing the user's attention to the search (also covered in the sketch following this list).
- The website is used as a landing page within the College for users logging onto the wireless network, and some users have it set as their home page. This makes understanding the Google Analytics data difficult, as a segment of the initial drop-off is likely to be caused by this. It is therefore recommended that wireless logon and other internal processes that redirect to the website be given specific landing pages, so that they can be excluded from future evaluations.
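Both search-related recommendations above could be prototyped roughly as follows; the element IDs are hypothetical, and this is a sketch of the proposed behaviour rather than code from the live site.

```typescript
// Sketch of the two search recommendations above. Element IDs are hypothetical.
function initSearchBoxes(): void {
  // 1. Focus the course search on load so users can start typing immediately.
  const courseSearch =
    document.getElementById("course-search") as HTMLInputElement | null;
  courseSearch?.focus();

  // 2. Hide the site search input behind a link that expands it on demand, so
  //    the landing page presents a single obvious place to type.
  const siteSearch =
    document.getElementById("site-search") as HTMLInputElement | null;
  const toggle = document.getElementById("site-search-toggle");
  if (!siteSearch || !toggle) return;

  siteSearch.style.display = "none";
  toggle.addEventListener("click", (event) => {
    event.preventDefault();
    toggle.style.display = "none";
    siteSearch.style.display = "inline-block";
    siteSearch.focus();
  });
}

document.addEventListener("DOMContentLoaded", initSearchBoxes);
```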

5.4.2.2 Crazy Egg

The heatmap of user clicks in Figure 3 reveals that all of the navigational elements of the website landing page are used; however, the most used parts are:

- Course Search
- Site Search
- Student Portal
- Leisure Course Hub Page
In terms of usability problems, the heatmap supports the Google Analytics finding that the site search is used regularly, probably instead of the more appropriate course search.

A combination of the hub page heatmap and scroll map confirmed a usability issue that was raised through expert evaluation but was not clear from the Google Analytics evaluation.

[Figure 3 – Crazy Egg heatmap of user clicks]

When users click to access a hub page, such as "University" shown in Figure 4, the area of the page that is visible is just the large images. The circle representing the hub page in the navigation menu increases in size, but not significantly enough for users to realise that anything has changed; they therefore click the navigation link again. Once the user realises that they were already on the hub page, it is likely they will feel stupid [3], and likely that this would become a dissatisfier [14].

[Figure 4 – Cropped heatmap for the University hub page]

The visualisation of where the user is needs to be more obvious, perhaps by removing the large banner image from the hub page altogether. An alternative solution could be to make the coloured horizontal bar element of the navigation menu thicker and to remove the white border from the current menu item, so that the current page is more obvious. Both of these should be tested using Crazy Egg to see whether the number of clicks to access the already selected page can be reduced. Figure 5 demonstrates a mock-up of the improvements, and a sketch of the second option follows below.

[Figure 5 – Navigation improvements]
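The second option could be prototyped along the following lines; the selectors and style values are illustrative assumptions, since the real menu markup is not reproduced here.

```typescript
// Sketch of the proposed navigation fix: make the current page's menu entry
// unmistakable. Selectors and style values are illustrative assumptions.
function highlightCurrentNavItem(): void {
  const links = document.querySelectorAll<HTMLAnchorElement>("nav a");
  links.forEach((link) => {
    if (link.pathname === window.location.pathname) {
      link.style.border = "none";            // remove the white border
      link.style.borderBottom = "8px solid"; // thicker bar in the link's colour
      link.setAttribute("aria-current", "page"); // expose state to screen readers
    } else {
      link.style.borderBottom = "2px solid";
    }
  });
}

document.addEventListener("DOMContentLoaded", highlightCurrentNavItem);
```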
The heatmap in Figure 4 also shows that once on the hub page, users then go to the course search, which they could or should have done on the first page. The recommended improvements to the visualisation of the search may improve this, although user evaluation should be conducted to understand the behaviour and confirm whether this is a usability problem. There is evidence of this problem across all hub pages.

6. CONCLUSIONS AND FUTURE RESEARCH

There are clearly a small number of high-impact usability problems that need remedying urgently. The evaluation has also revealed and/or confirmed a number of usability issues that, whilst not preventing users from achieving their tasks, are no doubt acting as dissatisfiers, such as clicking a navigation link twice because it is not clear that anything has changed.

The notable omission from this evaluation is the use of user testing. This should be undertaken as future research, as it is likely to highlight additional usability issues.

Each of the evaluation methods used contributed something different to the understanding of usability problems with the website. The expert evaluation probably significantly under-reported the possible issues, due to the evaluators' inexperience with heuristics; however, the method is useful and should become a regular part of the Information Systems team's testing.

This research has only started the analysis of the available data contained within Google Analytics and Crazy Egg; more resources should be allocated to understanding how these tools can be used effectively. Additionally, resources should be allocated to investigating how current tools for automating aspects of usability testing can complement the methods used here.

It is recommended that the use of all three UEMs continue, with usability evaluation becoming a key factor in all future website developments.

7. ACKNOWLEDGMENTS

The research could not have been completed without the support and assistance of the College Information Systems Team.

8. REFERENCES

[1] Benbunan-Fich, R. 2001. Using protocol analysis to evaluate the usability of commercial websites. Information and Management, 39, 151-163.
[2] Barnard, P.J., Hammond, N.V., Morton, J. and Long, J.B. 1981. Consistency and compatibility in human-computer dialogue. International Journal of Man-Machine Studies, 1(1), 87-134.
[3] Cooper, A., Reimann, R. and Cronin, D. 2007. About Face 3: The Essentials of Interaction Design. John Wiley & Sons, Indianapolis, IN.
[4] Dix, A., Finlay, J., Abowd, G. and Beale, R. 1998. Human-Computer Interaction (2nd ed.). Prentice Hall, Upper Saddle River, NJ.
[5] Ehrlich, K., Butler, M.B. and Pernice, K. 1994. Getting the whole team into usability testing. IEEE Software, January 1994, 89-91.
[6] Fadeyev, D. 2008. 10 useful techniques to improve your user interface designs. Available online: http://uxdesign.smashingmagazine.com/2008/12/15/10-useful-techniques-to-improve-your-user-interface-designs/ (accessed 11 January 2014).
[7] Garrido, A., Rossi, G. and Distante, D. 2011. Refactoring for usability in web applications. IEEE Software, 60-67.
[8] Hasan, L., Morris, A. and Probets, S. 2013. E-commerce websites for developing countries – a usability evaluation framework. Online Information Review, 37(2), 231-251.
[9] Herzberg, F., Mausner, B. and Snyderman, B.B. 1959. The Motivation to Work. John Wiley & Sons, New York, NY.
[10] Hom, J. 1998. The Usability Methods Toolbox. Available online: http://jthom.best.vwh.net/usability/usable.htm (accessed 28 December 2012).
[11] Hwang, W. and Salvendy, G. 2010. Number of people required for usability evaluation: the 10±2 rule. Communications of the ACM, 53(5). DOI: http://doi.acm.org/10.1145/1735223.1735255.
[12] ISO 9241-11. 1998. Ergonomic requirements for office work with visual display terminals (VDTs), Part 11: Guidance on usability.
[13] Ivory, M.Y. and Hearst, M.A. 2001. The state of the art in automating usability evaluation of user interfaces. ACM Computing Surveys, 33(4), 470-516.
[14] Kincl, T. and Strach, P. 2012. Measuring website quality: asymmetric effect of user satisfaction. Behaviour & Information Technology, 31(7), 647-657.
[15] Lindgaard, G. 2007. Aesthetics, visual appeal, usability and user satisfaction: what do the user's eyes tell the user's brain? Australian Journal of Emerging Technologies and Society, 5(1), 1-14.
[16] Masemola, S.S. and de Villiers, M.R. 2006. Towards a framework for usability testing of interactive e-learning applications in cognitive domains, illustrated by a case study. In J. Bishop and D. Kourie (eds.), Service-Oriented Software and Systems: Proceedings of SAICSIT 2006, 187-197.
[17] Nielsen, J. 1994. Usability Inspection Methods. Wiley.
[18] Nielsen, J. 2003. Usability 101: introduction to usability. Available at: http://www.nngroup.com/articles/usability-101-introduction-to-usability/ (accessed 11 January 2014).
[19] Nielsen, J. and Molich, R. 1990. Heuristic evaluation of user interfaces. In CHI '90 Conference Proceedings, ACM, 249-256.
[20] Nørgaard, M. and Hornbæk, K. 2009. Exploring the value of usability feedback formats. International Journal of Human-Computer Interaction, 25(1), 49-74.
[21] Prom, C. 2007. Understanding on-line archival use through web analytics. ICA-SUV Seminar, Dundee. Available at: www.library.uiuc.edu/archives/workpap/PromSUV2007.pdf (accessed 11 January 2014).
[22] Rosson, M.B. and Carroll, J.M. 2001. Usability Engineering: Scenario-Based Development of Human-Computer Interaction. Morgan Kaufmann, San Francisco, CA.
[23] Tan, W., Dahai, L. and Bishu, R. 2009. Web evaluation: heuristic evaluation vs. user testing. International Journal of Industrial Ergonomics, 39, 621-627.
[24] Tractinsky, N. and Zmiri, D. 2006. Exploring attributes of skins as potential antecedents of emotion in HCI. In P. Fishwick (ed.), Aesthetic Computing. MIT Press, Cambridge, MA, 405-422.
