Welcome to Providence College Blogs

Providence College has joined the Rhode Island Statewide Open Textbook Initiative. Launched in September 2016, the initiative aims to reduce college costs by saving students $5 million over five years through the use of openly licensed textbooks and open educational resources (OERs). In addition to PC, participating institutions currently include Rhode Island College, the University of Rhode Island, the Community College of Rhode Island, Brown University, Bryant University, Roger Williams University, and the New England Institute of Technology.

Here at PC, work has begun to raise awareness of OERs and open textbooks on campus. Representatives from the Library teamed up with James Campbell, Assistant Professor of Economics, to provide an introduction to open textbooks at the Center for Teaching Excellence on November 1st. The presentation covered the basics of OER, guidance on locating open materials, and open licensing with Creative Commons. Campbell is using an OpenStax textbook to teach several sections of Microeconomics this semester, and his insights on the experience were extremely valuable. You can view the slides from the talk here.

Through generous support from the Provost’s Office, the Center for Teaching Excellence and the Library are collaborating to administer a series of mini-grants supporting course design and revision projects that prioritize open educational resources. Awardees will be selected this month. Over the spring semester, mini-grant recipients will work closely with the Library to incorporate open content into their syllabi for adoption in Fall 2017.

A Steering Committee made up of library representatives from participating institutions will be responsible for implementing the RI Open Textbook Initiative. Members of the Steering Committee will communicate with the Open Textbook Network (URI, RIC, and CCRI are now member organizations), provide OER training opportunities for librarians around the state, and develop instruments for documenting and reporting the student savings resulting from the initiative.

The Library’s Digital Publishing Services team has been engaged with OER work for some time. We are thrilled about these new opportunities to collaborate with PC faculty around OERs. For further reading on this subject, check out some of our previous posts here, here, and here.

Recently, JSTOR, a digital library of academic journals, books, and primary sources and part of ITHAKA, a not-for-profit organization that also includes Ithaka S+R and Portico, announced a new program to make Open Access monographs available on the JSTOR platform. An initial 63 titles from four academic presses (University of California Press, University of Michigan Press, UCL Press, and Cornell University Press) are currently available.

“The introduction of this Open Access program is part of our ongoing efforts to expand discovery, access, and use of scholarly materials,” noted Frank Smith, Books at JSTOR Director. “We look forward to sharing what we learn with the scholarly communications community.”

On Tuesday, November 1st, the Library of Congress unveiled the redesign of its website homepage. The unveiling is part of a larger redesign of the full site, currently in the works. The Library’s blog, The Signal, recently published an interview (conducted by Jaime Mears) with Natalie Buda Smith, User Experience (UX) Team supervisor for the Library of Congress, in which she discussed user experience and the importance of design focus in libraries.

Project One is the name of the Library’s redesign initiative, led by Smith. One of Project One’s biggest challenges, says Smith, is that the Library started sharing its (vast amount of) content early on the web, using older technologies, and a substantial amount of “re-work” is necessary to integrate the old content with new technologies. Also challenging has been the task of conceptualizing a framework for the site that is optimized for search: decisions need to be made about which objects need metadata, and appropriate metadata needs to be assigned to those items. Once that foundation is laid, the team aims to build structures for packaging the content in different ways to appeal to certain audiences.

For more on the design process and to view the interview with Natalie Buda Smith, please visit the post on The Signal’s site here. To view the Library of Congress’s new homepage, please visit loc.gov.

This past September was the 350th anniversary of the Great Fire of London, and the Museum of London has augmented its commemorative “Fire! Fire!” exhibit with a Minecraft map in which players explore the city and fight the fire as it occurs. (NYT article here – one Youtube video of gameplay can be watched here.) One stated goal of using games to convey historical information is to attract and engage children and non-traditional museum patrons — but it’s also interesting to think about ways in which the game might provide a new learning experience even for people with a more conventional history background. For instance, you might read in a book or article that the spread of the fire is partly attributed to the Mayor’s delay in ordering the destruction of houses to create firebreaks — but you could also, as in the gameplay video linked above, run a long way through confusing, similar-looking burning streets to find the Mayor and bring him to the site where the fire started, because your objective as a player is to get him to give the order, and then feel the frustration when he refuses! (Empathy is a subject that comes up in discussion of history-based and history education video games.)

Another video game-related map is Pudding Lane Productions’ 2013 CryEngine map of the area where the fire began, which won the “Off the Map” competition for developing 3D video game scenery based on maps from the British Library. The developers’ discussion of their process reveals some of the challenges that also face scholars working with historical documents. Using the maps as their source, they were able to lay out the streets and the footprints of the buildings, but found that the resulting model was not cramped enough and lacked vitality. Revisions increased the overhang of buildings’ upper stories into the streets, as well as adding crates, carts, vendors’ stalls, wares hung outside shops, washing lines, and other “props” that wouldn’t have made it onto maps, but that were nonetheless a part of London and people’s experience of life in the city. Additionally, they added as many real, attested businesses as possible, using historical sources like Samuel Pepys’s diary; this lends the map a great deal of accuracy, but also highlights the gaps in our knowledge of day-to-day life, since most of the houses and businesses on the map simply had to be generic and modular.

Interestingly, the Pudding Lane developers also mention that “[o]ne key issue caused by following the source material so closely was that a lot of seventeenth-century London looked very similar”. They addressed this by using different palettes in different areas. (This map doesn’t have any people on it, but if it had, perhaps the difference in areas would be established by populating them with different kinds of non-player characters going about their business.) This issue is very prominent in the less-sophisticated Minecraft map as well, but in that game it might be a feature instead of a bug.

A film by web activist Brett Gaylor and musician Greg Gillis, aka Girl Talk, this is a compelling and fun movie about the history of copyright and its implications for creativity. It draws on the work of Girl Talk, and of the filmmaker himself, as examples of the complexities surrounding copyright law with regard to sampling music, film, and other media, and to using other artists’ creativity as a stepping stone for one’s own work.

The Internet’s Own Boy follows the story of programming prodigy and information activist Aaron Swartz. From Swartz’s help in the development of the basic internet protocol RSS to his co-founding of Reddit, his fingerprints are all over the internet. But it was Swartz’s groundbreaking work in social justice and political organizing, combined with his aggressive approach to information access, that ensnared him in a two-year legal nightmare. It was a battle that ended with Swartz taking his own life at the age of 26. Aaron’s story touched a nerve with people far beyond the online communities in which he was a celebrity.

Cooper Hewitt, Smithsonian Design Museum has digitized and released more than 200,000 objects, and as you might expect from a prominent design museum, the collection is presented in a sharp and engaging interface. They’ve included extensive metadata for each object, which allows for an engrossing browsing experience. You can search and filter by a variety of facets, including color, size, and image complexity (beta). Each object also has a visual timeline of its life in the collection, from acquisition to digitization.

The site also includes an Experimental section with a few features that you can play with, including “Albers boxes”, an homage to the Bauhaus color theorist Josef Albers:

“We show Albers boxes when an image can’t be found or when an image has not yet been digitized using the concentric squares as a device to convey some of the information about the object. The outer ring of an Albers box represents the department that an object belongs to; the middle ring represents the period that an object is a part of; the inner ring denotes the type of object […] We are trying to imagine a visual language that a person can become familiar with, over time, and use [as] a way to quickly scan a result set and gain some understanding in the absence of an image of the object itself.”
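As a concrete (and entirely hypothetical) sketch of the encoding the quote describes, a few lines of Python can render one Albers box as three nested SVG squares. The color tables below are invented stand-ins; the museum’s actual palette assignments aren’t given in this post:

```python
# Sketch of the "Albers box" idea: three concentric squares whose
# colors encode department (outer), period (middle), and object
# type (inner). All color values here are illustrative inventions.
DEPT_COLORS = {"Textiles": "#b5543c", "Drawings": "#3c6fb5"}
PERIOD_COLORS = {"19th century": "#d9a441", "20th century": "#6fa441"}
TYPE_COLORS = {"poster": "#eee5d0", "sample": "#d0e5ee"}

def albers_box(department, period, obj_type, size=100):
    """Return an SVG string of three nested squares for one object."""
    rings = [
        (0, DEPT_COLORS.get(department, "#cccccc")),        # outer ring
        (size // 6, PERIOD_COLORS.get(period, "#999999")),  # middle ring
        (size // 3, TYPE_COLORS.get(obj_type, "#666666")),  # inner square
    ]
    rects = "".join(
        f'<rect x="{o}" y="{o}" width="{size - 2 * o}" '
        f'height="{size - 2 * o}" fill="{color}"/>'
        for o, color in rings
    )
    return (
        f'<svg xmlns="http://www.w3.org/2000/svg" '
        f'width="{size}" height="{size}">{rects}</svg>'
    )

svg = albers_box("Textiles", "19th century", "poster")
```

The appeal of the scheme is that a browser can generate these placeholders on the fly from metadata alone, so even undigitized objects get a scannable visual identity.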

What’s maybe most impressive is that the collection was digitized in 18 months. For a glimpse behind the scenes, check out this video from Cooper Hewitt. And if you like to geek out in the weeds of things like project management and data mapping, you’ll want to check out Cooper Hewitt Labs, where Allison Hale is in the middle of a 4-part series of in-depth posts on the mass digitization, beginning with Workflows and Barcodes and Digital Asset Management.

Palladio is a research tool for examining data across time and space. It allows for the identification of patterns, clusters, and trends within data that may be difficult for an individual researcher interacting with the data to see. Palladio serves as a means of enhancing (not replacing) traditional qualitative humanities research methods. Data can be mapped, graphed to show network relationships, viewed and faceted as an interactive gallery, and more. Palladio comes out of Stanford University’s Humanities + Design research lab.

I’m enrolled in an Introduction to Digital Humanities course through Library Juice Academy. One of my assignments this week required an examination of Palladio (as well as a similar tool, Google Fusion Tables), and Palladio piqued my interest. My initial introduction to Palladio came through the very helpful Getting Started With Palladio tutorial by Miriam Posner. The tutorial provides clear, easy-to-follow instructions for uploading data into Palladio and beginning to work with its data tools; definitely check it out.

After completing Posner’s tutorial, I was inspired to apply Palladio to some data we have access to through DPS projects. I took a few minutes to aggregate data from a couple of different spreadsheets around the Dorr Letters Project. My data looks like this:

In less than a minute I was able to create this visualization graphing the “to” and “from” fields:

And this map showing the origination location for each item of correspondence:

I’ll continue to play with Palladio and update this post accordingly.
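To give a feel for the kind of tabular data Palladio expects, here is a minimal Python sketch that aggregates “to”/“from” pairs from a spreadsheet export into edge counts for a correspondence network — the same relationship the graph visualization draws. The column names and sample rows are hypothetical, not the actual Dorr Letters fields:

```python
import csv
import io
from collections import Counter

# Hypothetical correspondence data, one row per letter, in the flat
# spreadsheet shape Palladio accepts (names and places are examples).
SAMPLE_CSV = """\
From,To,Origin
Thomas Dorr,Aaron White,Providence
Thomas Dorr,Walter Burges,Providence
Aaron White,Thomas Dorr,Boston
Thomas Dorr,Aaron White,Newport
"""

def correspondence_edges(csv_text):
    """Count how many letters travelled along each (From, To) pair."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return Counter((row["From"], row["To"]) for row in reader)

edges = correspondence_edges(SAMPLE_CSV)
# The heaviest edge corresponds to what Palladio would draw as the
# strongest link between two correspondents.
print(edges.most_common(1))  # → [(('Thomas Dorr', 'Aaron White'), 2)]
```

Palladio does this aggregation for you on upload; the point of the sketch is just that a network view is nothing more than counts over sender/recipient pairs, which is why a flat spreadsheet is all it needs.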

Most higher education faculty are unaware of open educational resources (OER), but they are interested, and some are willing to give them a try. Survey results, based on responses from over 3,000 U.S. faculty, show that OER is not a driving force in the selection of materials, with the most significant barrier being the effort required to find and evaluate such materials. Use of open resources is low overall, but somewhat higher for large-enrollment introductory-level courses.

Selecting Teaching Resources:

Almost all (90%) teaching faculty selected new or revised educational materials for at least one course over the previous two years.

The most common activity was changing required materials for an existing course (74%), followed by substantially modifying a course (65%). Creating a new course was the least common activity (48%).

The most common factor cited by faculty when selecting educational resources was the cost to the students. After cost, the next most common was the comprehensiveness of the resource, followed by how easy it was to find.

There is a serious disconnect between how many faculty cite a factor in selecting educational resources and how satisfied they are with the state of that factor. For example, faculty are least satisfied with the cost of textbooks, yet cost is the most commonly cited factor in resource selection.

Required Textbooks:

Virtually all courses (98%) require a textbook or other material as part of their suite of required resources.

Required textbooks are more likely to be in printed form (69%) than digital. Faculty require digital textbooks in conjunction with a printed textbook more often than they require digital textbooks alone.

Only 5.3% of courses are using an openly licensed (Creative Commons or public domain) required textbook.

For large-enrollment introductory undergraduate courses, openly licensed OpenStax College textbooks are adopted at twice the rate (10%) of openly licensed textbooks across all courses.

Licensing:

There has been very little change in the past year in the proportion of faculty who report being aware of the copyright status of classroom content.

Awareness of public domain licensing and Creative Commons licensing has remained steady.

Faculty continue to have a much greater level of awareness of the type of licensing often used for OER (Creative Commons) than they do of OER itself, and it is clear that they do not always associate this licensing with OER.

Open Educational Resources:

Faculty awareness of OER has increased in the last year, but remains low. Only 6.6% of faculty reported that they were “Very aware” of open educational resources, with around three times that many (19%) saying that they were “Aware”.

The level of faculty awareness of open textbooks (a specific type of OER) was somewhat lower than that for open educational resources; only 34% of faculty claimed some level of awareness.

Barriers to OER Adoption:

The barriers to adopting OER most often cited by faculty are that “there are not enough resources for my subject” (49%), it is “too hard to find what I need” (48%) and “there is no comprehensive catalog of resources” (45%).

There has been a decrease in faculty concerns about permission to use or change OER materials, and increases in concerns about the quality of OER and whether it is timely and up-to-date.

Most faculty do not have experience searching for OER materials and so cannot compare the ease of finding OER with that of traditional materials. Only 2.5% thought it was easier to search for OER.

Future:

The number of faculty claiming that they will use OER in the future (6.9%) is of the same order of magnitude as the number already using open resources (5.3%). A larger group (31.3%) reports that they will consider future OER use.

These findings come from a research report released by the Babson Survey Research Group on July 26, 2016: “Opening the Textbook: Educational Resources in…”

On September 27th, the Library of Congress hosted a conference called Collections as Data in Washington, D.C. The conference website provides the following description for the event:

“The rise of accessible digital collections coupled with the development of tools for processing and analyzing data has enabled researchers to create new models of scholarship and inquiry. The National Digital Initiatives team invites leaders and experts from organizations that are collecting, preserving and providing researcher access to digital collections as data to share best practices and lessons learned. This event will also highlight new collaborative initiatives at the Library of Congress that seek to enhance researcher engagement and the use of digital collections as data.”

Participants had the option of attending in person or virtually, as the event was live-streamed on the Library of Congress YouTube channel; members of the Digital Publishing Services team attended sessions virtually throughout the day. Sessions were open to the public, and organizers asked that attendees use the hashtag #AsData in their posts. A video recording of the conference has been archived on the LOC YouTube channel. For more information about the event, please visit the conference website. (Source)

Leah Grandy, of the Loyalist Collection at the University of New Brunswick Libraries, has a few recent(-ish) posts on paleography, or deciphering historical handwriting. Grandy notes here that paleography training—previously thought to be necessary only for people studying medieval and early modern texts, which may be written in styles such as blackletter or secretary hand that they don’t necessarily encounter much in their modern lives—may also need to be extended to students and researchers of later centuries as well. Cursive, previously a staple of early education, is no longer taught in many schools, and as a result, undergrads are arriving at college who have trouble reading 18th-20th century handwritten primary sources. As someone who has deciphered written annotations for the Women Writers Project and sometimes transcribes documents on Shakespeare’s World for fun, I’m used to people recoiling in fear and/or disgust at the idea of facing down secretary hand, but it’s strange for me to think about people having a similar reaction to cursive!

In this post, Grandy offers a really helpful set of tips for reading or transcribing handwritten documents—whatever style they’re written in. Among them: comparing unclear letters/words to identifiable ones; looking up people and places; transcribing what you can identify and leaving blanks before coming back; guessing and going with your gut! If you’re a student or researcher dealing with handwritten primary sources, check it out.
