CHIRA is sponsored by INSTICC – Institute for Systems and Technologies of Information, Control and Communication

SCOPE

The purpose of the International Conference on Computer-Human Interaction Research and Applications (CHIRA) is to bring together professionals, academics and students who are interested in the advancement of research and practical applications of human-technology & human-computer interaction. Five parallel tracks will be held, covering different aspects of Computer-Human Interaction, including Interaction Design, Human Factors, Entertainment, Cognition, Perception, User-Friendly Software and Systems, Pervasive Technologies and Interactive Devices.

Papers describing original work on advanced methods, prototypes, systems, tools and techniques, as well as general survey papers indicating future directions, are encouraged. Accepted papers will be presented at the conference by one of the authors and published in the Proceedings of CHIRA, which will be made available in at least one digital library and submitted for indexing by the major international indexes.

CONFERENCE AREAS

Each of these topic areas is expanded below but the sub-topics list is not exhaustive. Papers may address one or more of the listed sub-topics, although authors should not feel limited by them. Unlisted but related sub-topics are also acceptable, provided they fit in one of the following main topic areas:

[Note the many references to types of presence in this story from techdigg about an app that immerses users in historical events. See the Kickstarter page for much more information, including videos, and to donate. –Matthew]

Timebound: New App Travels Back in Time to Relive History as It Happened

Some ideas are so brilliant, yet so simple, that it’s almost frustrating to learn about them. “Why didn’t I think of that?” is usually the first thought.

The new app Timebound, estimated to be available for iOS and Android as soon as May 2017, has that effect because of its very simple, but smart use of push-notification technology on mobile devices. It leverages that simple technology for all it’s worth, then combines it with superb research and writing to make something special.

It will be an especially desirable app for history buffs who love reliving and immersing themselves in historical events.

Of course, there’s the education angle too: once the word gets around about this new app, tech-savvy history teachers around the world will likely make it a mandatory download for their students.

History Brought to Life: What Timebound Does (Besides Almost Achieving Time Travel)

Timebound essentially places you in the real-time flow of history as it happened, notifying you of every twist and turn of an event through push notifications. It follows the exact times and sequences of important developments as they originally unfolded.

You suddenly find yourself standing in the shoes of those who lived through it.

And it does this in an engaging, exciting way through good writing and storytelling; at least, that is the goal of the Timebound team. They have also created immersive maps and other bonus resources that make for a feature-rich experience.

In the company’s own words:

“Timebound is an app for learning about the past in an easy and exciting way. It allows you to follow important historical events hour-by-hour and minute-by-minute. You can join the Titanic on her maiden voyage, witness the hunt for Jack the Ripper, see the first landing on the Moon, experience the first Woodstock festival, and dozens of other thrilling stories.”

We invite original contributions on all topics related to Human-Computer Interaction, Interaction Design and the design of interactive technologies. Submissions are invited for long papers, short papers, work-in-progress papers, workshops, demonstrations, the Doctoral Consortium and the Student Design Challenge.

OzCHI is the annual non-profit conference for the Computer-Human Interaction Special Interest Group (CHISIG) of the Human Factors and Ergonomics Society of Australia. OzCHI is Australia’s leading forum for the latest in HCI research and practice, attracting a broad international community of researchers, industry practitioners, academics and students. Participants typically come from more than 25 countries and from a range of backgrounds, including interface designers, user experience experts, information architects, software engineers, human factors specialists, information systems analysts and social scientists. We welcome international submissions and participation, and we particularly encourage submissions and participation from industry.

The theme of the conference this year is Human-Nature. Our theme highlights both the need for socio-technical systems that bring out our better human nature, and the need to better engage people with the natural environment in which we live, so that we understand, appreciate and learn to live in harmony with nature and the wonders it holds.

[Echoing the excellent series Humans, the latest advanced sex doll includes a broader set of AI-based behaviors. The first story below, from LADbible, describes Sergi Santos’ creation; the second story, from MIC, considers some of the many ethical issues it (she?) raises. See the Synthiea Amatus website for much more information including images and videos. –Matthew]

ICIDS has its origin in a series of related international conferences that ran between 2001 and 2007. Since 2008, ICIDS has been the premier annual venue that gathers researchers, developers, practitioners and theorists to present and share the latest innovations, insights and techniques in the expanding field of interactive storytelling and the technologies that support it. The field brings together a highly dynamic and interdisciplinary community, in which narrative studies, computer science, interactive and immersive technologies, the arts, and creativity converge to develop new expressive forms in a myriad of domains that include artistic projects, interactive documentaries, cinematic games, serious games, assistive technologies, edutainment, pedagogy, museum science, advertising and entertainment, to mention a few. The conference has a long-standing tradition of bringing together academia, industry, designers, developers and artists in an interdisciplinary dialogue through a mix of keynote lectures, long and short article presentations, posters, workshops, and very lively demo sessions. Authors should indicate the category to which they are submitting.

ICIDS welcomes contributions from a large range of fields and disciplines related to interactive storytelling. We encourage original contributions in the forms of research papers, position papers, posters and demonstrations, presenting new scientific results, innovative theories, novel technological implementations, case studies and creative artistic projects in the field of Interactive Digital Storytelling and its possible applications in different domains. We particularly welcome research on topics in the following five areas:

[This is a good example of the movement toward presence-evoking formats in media campaigns, in this case for fire safety. The story is from StopPress, where it includes two videos and more images. More information and pictures are available in a story from NZ TechBlog, which includes this additional detail: “People are prompted to enter their address when commencing the Escape My House VR activity. Once they reach the end and have made it out of the house safely onto the street, they will be confronted with an image of their own home burning down in front of them, pulled in via Google StreetView.” Finally, firefighters talk about the VR experience in a 1:56 video available from Stuff. –Matthew]

UPDATE: FCB has reported that since its launch on 22 March, the online tool has been experienced by over 120,000 users, and the video on Facebook has been viewed more than 10 million times and shared by more than 86,000 people. It has gone beyond New Zealand and has also been accessed from Australia, Austria, Canada, France, Taiwan and the United States.

It’s winter, it’s cold, and the laundry’s still wet, so you move it a little closer to the fireplace, just enough to get your clothes toasty warm in no time. You leave the room for the briefest of moments, and the next thing you know, you’re engulfed in a sea of flames. One minute the ceiling’s 30 degrees Celsius; the next, it’s skyrocketed to a whopping 350. Your senses go into overdrive as fire crackles in your ear and smoke billows before your eyes. As panic swiftly sets in, what should you do?

Thankfully, you can relax, because the scenario is a fictional one that’s the basis for NZ Fire Service’s latest campaign, Escape My House. Through virtual reality technology and 360-degree video, the experience explores the shocking realities of escaping a house fire and includes features that demonstrate the emotional and psychological barriers that those caught in the event are often faced with. As users navigate the house, they’re encouraged to interact with the items they encounter (a games console, a photo album, a child’s soft toy) and try out various exits, several of which turn out to be inaccessible. A clock and two thermometers are displayed on the screen as well, ticking furiously upwards to stimulate a tangible sense of urgency, dread, fear and panic.

[I highly recommend watching the 24-second viral video available in this short Huffington Post story. For more internet reactions see coverage in NDTV. And for more items like this, please join us in the (free and public) ISPR Presence Community Facebook group. –Matthew]

[The technology described in this widely cited story from New Scientist moves us closer to sharing richer and more diverse sensory experiences over time and distance; the original story includes a 0:35 minute video and a second 1:08 minute video is available on YouTube. See Nimesha Ranasinghe’s website for more information, including the earlier Digital Lollipop project. –Matthew]

A system of sensors and electrodes can digitally transmit the basic colour and sourness of a glass of lemonade to a tumbler of water, making it look and taste like a different drink. The idea is to let people share sensory experiences over the internet.

The 1st Workshop on Games-Human Interaction (GHItaly 2017) aims to bring together scholars and industry practitioners to establish a common ground on the topic. The main goal of the event is to spur discussion, the exchange of ideas, and the development of new ways of researching, teaching, and working on HCI as applied to the design and production of video games. The application range of video games should be understood in its broadest sense, covering both entertainment and applied purposes.

GHItaly 2017 will be held in conjunction with the 12th Edition of CHItaly, the biennial Conference of the Italian ACM SIGCHI Chapter, from 18 to 20 September 2017 in Cagliari.

TOPICS

The workshop aims to collect contributions advancing research applied to video games. Suggested topics include (but are not limited to):

[Here’s a good example of the evolution of presence-evoking technology and (though it’s not discussed in this story from idealog), the obvious ethical issues it raises. The idealog story includes more pictures and two videos; MIT Technology Review’s coverage includes a 1:42 minute video of Nadia; and more information is available from the Soul Machines website. For a related story that raises some of the ethical issues, see “Can Alexa lie?” from Shelly Palmer’s blog. –Matthew]

A Kiwi company at the forefront of humanising AI technology has revealed its first virtual assistant called ‘Nadia’, which has been voiced by actress Cate Blanchett.

Soul Machines is an Auckland-based company that develops intelligent, emotionally responsive avatars that improve the user experience on artificial intelligence platforms.

After receiving a $7.5 million investment from Hong Kong-based artificial intelligence and virtual reality investor, Horizon Ventures, the company formally launched in November last year.

Its technology is based on ‘Baby X’, a creation by Mark Sagar and his engineering research team at the Auckland Bioengineering Institute at the University of Auckland. Sagar is the CEO of Soul Machines.

Baby X is an emotionally intelligent virtual agent – and a digital simulation of an infant – which reacts to the person it’s communicating with via facial expressions and words in real time.

The technology is incredibly realistic, albeit quite disconcerting to some, as demonstrated by one comment on the Baby X Vimeo video: “I think that as soon as the baby turned into a floating brain is when I realised that night-terrors would be in my near future.”

“Just about everybody who sees it is absolutely amazed that it’s actually a digital creation and not a video image of a baby,” Soul Machines chief business officer Greg Cross says.

Now, after about five years of R&D, Nadia is the first commercial project to be launched using Baby X’s technology.

It was developed for Australia’s National Disability Insurance Scheme (NDIS) using IBM Watson’s artificial intelligence technology as a cognitive back-end, in partnership with FaceMe, an Auckland-based real-time video communication company.