And We Award the Inaugural TOCHI Best Paper Award, 2016, to…

We interrupt your regularly scheduled blog posts for a special message from TOCHI’s Editor-in-Chief, Ken Hinckley, who shares with us some breaking news…


 

Banner for ACM Transactions on Computer-Human Interaction Best Paper Award, 2016

And yes, thank you, because a very special moment in the history of the TOCHI journal has indeed arrived, and herein we unveil the inaugural…

 

ACM Transactions on Computer-Human Interaction

Best Paper Award

— 2016 —  

So sit back, grab some popcorn (and perhaps a beverage of your choosing), and enjoy the festivities.

 

With the Full Red Carpet Treatment. Of course, on such a celebratory occasion, we must roll out a luxuriant red promenade.

For a fleeting moment, we even considered a military parade for this inaugural occasion. But budgets being what they are, the best we could afford turned out to be a brigade of “Reviewer 2’s” armed with sharp red pens. To be brutally honest, we feared this would not go over well, to say the least, and so all such plans were scrapped forthwith.

And with the reality of the publishing industry (as of early 2017) being what it is, our “red” carpet, I am afraid, must be printed solely in black and white.

Furthermore, rather than a plush walkway, the substrate upon which we must strut our stuff is much more akin to recycled newsprint.

But what a venue it is!

Okay, enough fun for now.

So let me set the stage for the award, and in so doing, switch to what my wife calls my serious voice…


It takes incredibly hard work to get into TOCHI, and many notable HCI researchers have published their work in our pages. Even more important, I think, is the wave of up-and-comers in the field who are constantly breaking new ground. We are honored to have played a small role in building their careers and publication credentials as well.

TOCHI plays a vital role in the HCI community because it offers a forum for results that sprawl beyond the tidy boxes, tied up with neat satin bows, that can sometimes come to dominate typical conference papers. I’ve certainly written my fair share of those (only without the neatness, and often with some loose ends in those bows as well…). And of course there is nothing wrong with the “typical” conference-paper type of contribution, but by the same token it’s really important that the field has venues for results that are “out of the box” in a sense—and indeed, that span multiple boxes in the form of cross-discipline work, as well.

In that regard, the article we’ve selected for our 2016 Best Paper Award is a great representative of the field. It reports on an interdisciplinary effort that advances the needs of a particular user community, but in so doing pushes on the boundaries of interaction design and computer science as well. To build the system the authors envisioned, the research had to upend some conventional wisdom about image navigation and invent new interaction techniques along the way.

So (drum roll please), without further ado…

 

The recipient of the 2016 TOCHI Best Paper Award is:

 

The Design and Evaluation of

Interfaces for Navigating Gigapixel Images

in Digital Pathology

 

Roy A. Ruddle             School of Computing, University of Leeds, Leeds, UK

Rhys G. Thomas          School of Computing, University of Leeds, Leeds, UK

Rebecca Randell         School of Healthcare, University of Leeds, UK

Philip Quirke               Leeds Institute of Cancer and Pathology, University of Leeds, UK

Darren Treanor           St James’ University Hospital, Leeds, UK, and
Leeds Institute of Cancer and Pathology, University of Leeds, UK

 

ACM Transactions on Computer-Human Interaction
Volume 23, No. 1, Article 5 (February 2016): 29 pages.
DOI= http://dx.doi.org/10.1145/2834117

For this fine accomplishment, each of the authors will receive a physical manifestation of the award, which looks something like the following:

 Plaque for the inaugural TOCHI Best Paper Award, 2016

And to pique your interest in this fine work just a bit further, the following abstract characterizes it in the authors’ own words:

 This article describes the design and evaluation of two generations of an interface for navigating datasets of gigapixel images that pathologists use to diagnose cancer.

The interface design is innovative because users panned with an overview:detail view scale difference that was up to 57 times larger than established guidelines, and 1 million pixel “thumbnail” overviews that leveraged the real estate of high-resolution workstation displays.

The research involved experts performing real work (pathologists diagnosing cancer), using datasets that were up to 3,150 times larger than those used in previous studies that involved navigating images. The evaluation provides evidence about the effectiveness of the interfaces and characterizes how experts navigate gigapixel images when performing real work. Similar interfaces could be adopted in applications that use other types of high-resolution images (e.g., remote sensing or high-throughput microscopy).

Check it out. You’ll be glad you did. By the time you read this, the article should be available open access in the ACM Digital Library—sporting a spiffy new award badge, no less—at:

http://dx.doi.org/10.1145/2834117

Announcing Best Paper Awards for TOCHI

I’ve contributed to TOCHI—as a reviewer, author, Associate Editor, and now Editor-in-Chief—for almost 20 years.

And although I recognize that not everyone is totally comfortable with the idea of best paper awards for journals, I think it’s high time that we also acknowledge that TOCHI publishes a lot of truly excellent work.

When I assumed the helm of TOCHI, I embarked on a quick survey of the other Transactions-level journals. I also studied a number of status reports authored by their various Editors-in-Chief.

Several Transactions already have established “best paper” (and other) awards.

Most of our leading SIGCHI conferences have had best paper awards for many years now.

And the ACM Publications Board already has a well-documented process in place for establishing Best Paper Awards for its journals.

I therefore decided that it was time.

And submitted a formal proposal.

Which has now been approved.

As a mark of distinction at an already highly selective journal, the annual ACM Transactions on Computer-Human Interaction Best Paper Award will bring considerable prestige to the authors thus distinguished.

The purpose of the award is to recognize and bring greater attention to the excellence of top papers published in TOCHI. This helps build the careers of our authors as well as the stature and desirability of publication in the TOCHI journal itself.

In addition to giving authors further incentive to submit their best work, the award affords carry-over benefits in publicity, downloads, and citations, thereby enhancing the influence of the award winners as well as the impact factor of all the excellent papers that we publish.

We will play no favorites here; in fact, the formal proposal bars the Editor-in-Chief from consideration. The sole criterion shall be the overall merit of the work, in terms of technical excellence, significance to the research community, impact, clarity of presentation, and scope of the contribution, among other hallmarks of outstanding research.

You can find all the details, and the formal documentation of the rules governing the selection committee, at:

http://tochi.acm.org/awards

Okay.

There you have it.

We are thrilled to start recognizing our very best.

I really hope that you will submit us your best work, to keep our pipeline bubbling with great contributions from our vibrant community.

And that you will also take a moment to nominate impactful TOCHI papers as they come to your attention over the course of the year; we accept nominations from anyone in the community. A brief statement of why you think the paper should be considered for an award is appreciated but not required.

With the New Year upon us, I am really looking forward to 2017.

And I hope that you are too.