Research Blogging
23 posts tagged with "Research Blogging"
- Information Sediment and Data Particles
Sediment: Current thought conceptualises broad data as one layer of data overlaying another, usually onto some base-data; the base-data being useful and referable to the user. These base-data may be absolute geographic coordinates, relative artefact coordinates (from eye-tracking, say), or navigation pathways through websites. The key feature is that the base-data is a common dimension among the datasets to be layered, and is relevant to the area of inquiry.
- Analysing the visual complexity of web pages using document structure
Briefly, counting the number of visible edges on a web page is a good indicator of the perceived complexity of that page. There is a lot more to it than this obviously… [Figure: ranking scores from the manual and computational algorithms, line chart.] The perception of the visual complexity of World Wide Web (Web) pages is a topic of significant interest. Previous work has examined the relationship between complexity and various aspects of presentation, including font styles, colours and images, but automatically quantifying this dimension of a web page at the level of the document remains a challenge.
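The edge-counting intuition is easy to sketch. Below is a minimal illustration (my own, not the paper’s method) of counting ‘visible edge’ pixels in a greyscale page screenshot using a simple gradient threshold; the function name and the threshold value are assumptions for the sake of the example.

```python
import numpy as np

def edge_count(gray: np.ndarray, threshold: float = 50.0) -> int:
    """Count 'visible edge' pixels in a greyscale screenshot.

    A pixel counts as an edge if the magnitude of its intensity
    gradient exceeds `threshold`.
    """
    gy, gx = np.gradient(gray.astype(float))  # per-axis intensity gradients
    magnitude = np.hypot(gx, gy)              # gradient magnitude per pixel
    return int(np.count_nonzero(magnitude > threshold))

# A flat page region has no edges; a page with a boxed element has many.
flat = np.full((100, 100), 255.0)
boxed = flat.copy()
boxed[30:70, 30:70] = 0.0  # a dark rectangle, e.g. an image or button

print(edge_count(flat))   # 0
print(edge_count(boxed))  # positive: the rectangle's border pixels
```

A real pipeline would rasterise the rendered page and use a proper edge detector (Canny, say), but the principle — more edge pixels, more perceived complexity — is the same.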
- ACM ASSETS 2012: The Best Paper According to Me!
Well I’m finally back from ASSETS and the jet lag is disappearing. While ASSETS awards the official Best Paper prizes, most years I disagree, and this year is no exception. For me the best paper - and the best science - at ASSETS 2012 was ‘Evaluation of dynamic image pre-compensation for computer users with severe refractive error’, authored by Jian Huang, Armando Barreto, and Malek Adjouadi, and presented by Armando Barreto.
- Adapting Interfaces to Suit Our Senses
A few weeks ago I gave a presentation at the Instituto Superior Técnico (IST) at the Technical University of Lisbon (UTL) in Lisbon. It was about Deep Accessibility and how we should think about adapting interfaces to suit our senses. Here are my slides and two entries in the style of a Tiny Transactions on Computer Science (TinyToCS); “the premier venue for computer science research of 140 characters or less”.
- The Uptake of Web 2.0 Technologies, and its Impact on Visually Disabled Users
Our analysis shows that for the most popular 500 sites, JavaScript is used in 93%, Flash in 27% and about one-third (30%) use XMLHttpRequest, a technology used to generate dynamic updates. Uptake of XMLHttpRequest is approximately 2.3% per year across a random selection of 500 sites and is probably higher in the most popular sites. So, when examining dynamic updates from the perspective of visually disabled users, evidence suggests that, at best, most users can currently reach updated content, but they must do so manually, and are rarely given any automated indication that any update has occurred.
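A survey of this kind can be approximated with a very naive sketch: scan each page’s source for tell-tale markers of each technology and tally the percentages. This is an illustrative assumption about the approach, not the authors’ actual instrumentation (which would need to execute scripts rather than just pattern-match source):

```python
import re

# Naive, illustrative markers; real detection needs script execution.
PATTERNS = {
    "JavaScript": re.compile(r"<script", re.I),
    "Flash": re.compile(r"\.swf|shockwave-flash", re.I),
    "XMLHttpRequest": re.compile(r"XMLHttpRequest"),
}

def tech_usage(pages):
    """Return the percentage of pages whose source matches each marker."""
    counts = {name: 0 for name in PATTERNS}
    for html in pages:
        for name, pattern in PATTERNS.items():
            if pattern.search(html):
                counts[name] += 1
    return {name: 100.0 * n / len(pages) for name, n in counts.items()}

pages = [
    "<html><script>var xhr = new XMLHttpRequest();</script></html>",
    "<html><script src='app.js'></script></html>",
    "<html><p>static page</p></html>",
    "<html><embed src='movie.swf'></html>",
]
print(tech_usage(pages))
# e.g. {'JavaScript': 50.0, 'Flash': 25.0, 'XMLHttpRequest': 25.0}
```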
- User Interaction Revived at WWW2012 - www2012
For the last two or three years I’ve been increasingly vocal - and annoyed - at the lack of human factors and user interface work at WWW. I’m glad to say that in 2012 things seem to have changed - let’s hope for continued change into 2013 and beyond. Finally, the Web Conference gets a ‘real’ human factors session! After a number of years bereft of user interaction and human factors work - beyond a submissions track - UI work makes a welcome return to WWW 2012.
- Designing the Star User Interface [UX]
One of my ‘A History of HCI in 15 Papers’. “The Star system (circa 1980, and as described in Byte [1]) gave rise to five principles which, in my opinion, are so important and timeless that their formulation and practical application as part of the Xerox Star user interface was without doubt revolutionary.” The Xerox ‘Star’ was a commercial version of the prototypical Xerox Alto – if one thousand fully working systems, used internally at ‘PARC’ day in, day out over seven years, can be said to be prototypical.
- Why Most Published Research Findings are False - Or Are They?
“Of the 49 articles, 45 claimed to have uncovered effective interventions. Thirty-four of these claims had been retested, and 14 of these, or 41 percent, had been convincingly shown to be wrong or significantly exaggerated. If between a third and a half of the most acclaimed research in medicine was proving untrustworthy, the scope and impact of the problem were undeniable.” Well I’m really heartened to see scientific debate progressing as it should do.
- Fitts, and the Amplitude of Movement
“The key aspect of this work is not the extent of the studies - using hundreds of participants for one specific protocol - but the combination of three experimental protocols coupled with small user groups.” One of my ‘A History of HCI in 15 Papers’. Paul M. Fitts’ seminal work “The information capacity of the human motor system in controlling the amplitude of movement” [1] does exactly what a good human factors paper should.
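Fitts’ result is usually stated as the now-standard law: movement time grows linearly with the index of difficulty, MT = a + b·log2(2A/W), where A is movement amplitude and W is target width. A small sketch with illustrative, not empirical, coefficients:

```python
import math

def fitts_movement_time(a, b, amplitude, width):
    """Fitts's law: MT = a + b * ID, where the index of
    difficulty ID = log2(2A / W) is measured in bits."""
    index_of_difficulty = math.log2(2 * amplitude / width)
    return a + b * index_of_difficulty

# Illustrative coefficients: a = 0.1 s intercept, b = 0.1 s per bit.
# A target 4 units wide at amplitude 16 gives ID = log2(8) = 3 bits.
print(fitts_movement_time(0.1, 0.1, amplitude=16, width=4))  # ≈ 0.4 s
```

Note the trade-off the law captures: doubling the amplitude, or halving the target width, adds exactly one bit of difficulty and hence b more seconds.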
- The Cocktail Party Problem [accessibility a11y]
“This can only be useful work in the domain of blindness, situational impairment, and accessibility, in that it may be possible to convey limited Web page information spatially, dynamically, and with a high degree of comprehension, at seven (or nine) times the speed, because of the ability to comprehend highly parallel speech.” One of my ‘A History of HCI in 15 Papers’. It is unlikely that Colin Cherry realised the significance the community would place on the short five-page paper he sent to the Acoustical Society of America in 1953 [1].
- A History of HCI in 15 Papers
“How would you describe HCI in just 15 research papers - and indeed, could you do this as a teachable unit?” [Image: Paris Observatory astrolabe.] Inspired by the recent A History of the World in 100 Objects, it’s a simple idea: describe the last two million years of world history by focusing on 100 objects created in that period from all over the world. Here’s one you may like: “The astrolabe was highly developed in the Islamic world by 800 and was introduced to Europe from Islamic Spain (Andalusia) in the early 12th century.”
- Of Chocolate and Human Factors
My final week discussing Dix 2010 [1], which I covered last week, and the week before that too. Now let’s imagine you have a group of children and want to give them lunch. In the UK you might well choose baked beans. Not the most exciting choice, but few children actively dislike baked beans; they are acceptable to everyone. However, give each of those children a euro (or maybe two) in a sweet shop … they will all come away with a different chocolate bar; the chocolate bar that is ‘OK’ for everyone gets chosen by none.
- Single User Studies Considered Useful
“What?” I hear you cry, “single user studies can’t be valid; even ethnographies have more than one user”. Well, that’s what I was saying before reading Dix 2010 [1], which I covered last week. The critical thing that Dix sees as different is that - and I’m paraphrasing and using my own terms here - single user studies can be used to scope extent, as opposed to our normal desire to support a point via a measure of the magnitude of similarity across users; as a way of discovering outliers, as opposed to data that harmonise with the rest of the sample; and as a way of disproving the rule which all the other sample data seem to support.
- Authonomy Points the Way to Open Peer Reviewing
Authonomy is a unique online community that connects readers, writers and publishing professionals. It was conceived and built by editors at HarperCollins Publishers. They are in ‘beta’ at the moment, so they’re still developing and perfecting the site. Authonomy invites unpublished and self-published authors to post their manuscripts for visitors to read online. Authors create their own personal page on the site to host their project, and must make at least 10,000 words available for the public to read.
- Research Funding - and a Happy New Year 2010
Well first off, let’s say goodbye to 2010 and welcome in 2011 - I’m sure Times Square will be as crowded as it was in the 1950s - different but the same! Now let’s look at research funding [1] - I think this can be equally applied to paper acceptance rates - in the hope of a better funded 2011! Current thought seems to be that 30% is about the right level of acceptance for funding.
- Defining UX - and a Merry Christmas 2010!
As a positivist research scientist I’ve been struggling with the whole User Experience (UX) space for a long time, because to me it just seems a bit - well - ‘fluffy’. Many people seem to have got to grips with it, including Rui Lopes (see his recent blog articles) but to me, the more I read about the subject the more I think it is fine for evaluating specific interfaces but that the results cannot be generalised.
- W4A Paper Deadline is 'Danger Close' - accessibility a11y w4a11
So we are coming very close to the W4A paper deadline for the 2011 edition; indeed, it is on 10-Jan-2011. The theme this year is ‘Crowdsourcing the Cloud: An Inclusive Web by All and For All?’. While Crowdsourcing the Cloud is the theme, please don’t be deterred if this rather specialised area is not yours. The organisers would like to see all quality work on Web Accessibility, regardless of the particular field within accessibility.
- Visual Complexity Rankings and Accessibility Metrics - accessibility a11y
Eleni Michailidou passed her PhD defence with flying colours and now her work ‘Visual Complexity Rankings and Accessibility Metrics’ is published. I’ll let her abstract tell the story, but this is some really interesting work. The World Wide Web (Web) has become the major means of distribution and use of information by individuals around the world. Web page designers focus on good visual presentation to implicitly help users navigate, understand, and interact with the content.
- ASSETS 2010 Picks - assets10
We did present at ASSETS 2010, as I previously said, and I must say that I think this year’s conference was solid. Maybe the work presented was not completely within my frame of interest; indeed, there was Rehabilitation Engineering, Assistive Technology, educational, and advocacy work there, which is interesting but for me not directly relevant. However, there were a couple of papers that did in principle offer the promise (if not yet realised) of being transformative, and of providing some good solid scientific understanding.
- Interface Systems Evaluation & Innovation
I recently came across a paper discussing the evaluation of user interface systems. In it the author proposes that complex user interface systems and architectures do not readily yield to the research methods we currently use. It was at this point I started to bristle with derision, in a very defensive “I’m a research scientist and the scientific method says that we must have objective measures to express an accurate understanding!” frame of mind.
- Model-Based User Interfaces and the Web
Interesting ideas are coming from the Model-Based UI XG W3C Incubator Group with their Final Report of 04 May 2010, proposing model-driven approaches to Web interface creation and Web application interaction. Within the Web Ergonomics domain I’m particularly interested in the sections on user modelling [1] via the use of an ontology [2]. That said, I think this particular approach shows a fundamental misunderstanding of the way ontologies are used within the Semantic Web and Description Logic communities, mainly because they seem to want to model an individual user as opposed to a generic type of user.
- Web4All Conference 2010
This year’s conference focused on Developing Regions, wishing to investigate accessibility’s Common Goals and Common Problems. The rationale was that the community thought a revolution in the information society was starting, based on the use of mobile phones in developing countries. The hyper-growth of mobile phone penetration was deeply changing the lives of people in most of the world: their ways of communicating, working, learning, and structuring their societies.
- That Pesky Number 7
One of my ‘A History of HCI in 15 Papers’. Models of the user have existed in HCI for a number of years. Some of the first were developed by Miller in an attempt to apply information theory to the human. Information theory is a branch of applied mathematics and electrical engineering involving the quantification of information. Historically, information theory was developed by Shannon to find fundamental limits on signal processing operations, such as compressing data and reliably storing and communicating data.
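Miller’s link to Shannon is the idea of channel capacity measured in bits: choosing correctly among N equally likely alternatives transmits log2(N) bits, and people appear to cap out at roughly 2.5 to 3 bits (about seven alternatives) for one-dimensional absolute judgements. A small sketch of the underlying arithmetic - my own illustration, obviously not Miller’s:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Seven equally likely alternatives carry log2(7) ≈ 2.81 bits,
# right around the observed human channel capacity.
seven = [1 / 7] * 7
print(entropy_bits(seven))   # ≈ 2.81 bits
print(math.log2(7))          # same value: entropy of a uniform distribution
```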