Archive for the ‘Cloud Preservation’ Category

LinkedIn has recently made modifications to its authentication system requiring users to re-authenticate every 60 days. Expiring access in this way is generally done to prevent situations where a third party was granted permission long ago and you haven’t gotten around to revoking its access to your data.

To re-authenticate, click “Settings” on your LinkedIn feed, then select your authentication method as pictured here. (The email option is typically intended to be used only for sending a link to someone who does not have access to CloudPreservation.)

As the feed’s owner, you will be notified again when the authorization period is expiring. Failure to (re)grant that permission will result in CloudPreservation no longer being able to obtain/capture your data from LinkedIn. As always, the actual authentication involves CloudPreservation sending you to a LinkedIn server to provide your credentials and grant the permission. CloudPreservation will never obtain or store your personal LinkedIn password.
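The flow described above matches the standard OAuth 2.0 authorization-code pattern: the app redirects you to the provider’s own site to enter credentials and only ever receives a short-lived authorization code back. As a rough illustration (the endpoint, client ID, and callback URL below are placeholders, not CloudPreservation’s actual values):

```python
from urllib.parse import urlencode

# Hypothetical values; the real client ID and callback are managed by the app.
AUTHORIZE_ENDPOINT = "https://www.linkedin.com/oauth/v2/authorization"
CLIENT_ID = "example-client-id"
REDIRECT_URI = "https://app.example.com/oauth/callback"

def build_authorization_url(state: str) -> str:
    """Build the URL the user is sent to. Credentials are entered on
    LinkedIn's site; only an authorization code comes back to the app,
    so the app never sees or stores the user's password."""
    params = {
        "response_type": "code",   # authorization-code grant
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "state": state,            # CSRF-protection token
    }
    return f"{AUTHORIZE_ENDPOINT}?{urlencode(params)}"

url = build_authorization_url("abc123")
```

This is why re-granting permission is safe: you are only ever typing your password into LinkedIn itself.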

As an instance admin for CP, keeping track of your users’ crawl authorization statuses can be tricky. Notifications are sent to users to let them know the social media site requires re-authorization, but if they don’t take action… you need to know so you can get on their case! To alleviate this issue we have added a new field to CP cases:

A look at the new changes in the CP tab under account management

This field is available under the CP tab in “Manage account”. The list takes a group of comma-separated email addresses for the people you want notified whenever authorizations or reauthorizations are needed. This makes the process more visible, and you won’t have to remember to remind Bob down the hall that he needs to reauthorize his Twitter crawl. The field can be set when the instance is created, or modified at any time afterward from the CP tab using the “Modify Repository” utility.
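A comma-separated email field like this one is typically normalized before use by splitting on commas and dropping stray whitespace and blanks. A minimal sketch (the function name is ours, not part of the product):

```python
def parse_notification_list(raw: str) -> list[str]:
    """Split a comma-separated email field into a clean list,
    trimming whitespace and dropping empty entries (e.g. a
    trailing comma)."""
    return [addr.strip() for addr in raw.split(",") if addr.strip()]

parse_notification_list("bob@example.com, alice@example.com,")
# -> ["bob@example.com", "alice@example.com"]
```

Trailing commas and extra spaces are harmless under this scheme, which is the behavior you want from a hand-typed settings field.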

The BCC field can be changed at any time from the Modify Repository tool.

The other new CC function is small, but helpful. When uploading a new file, or a new version of a file, in TC you can now CC yourself alongside the users you want to notify. This will help you keep up with TC file changes. The recipient list is included in the message delivered to each user, allowing an easier audit of who received the notification.

Facebook has recently announced that they will start expiring access tokens after 60 days. These tokens are used for Cloud Preservation to crawl Facebook feeds that require user authorization. To ensure proper crawling of your Facebook feeds you will need to reauthorize Cloud Preservation every 60 days.

This can be done by going to the feed’s settings and choosing either to enter your Facebook credentials or to email the user who originally authorized the feed.

After saving the changes you will either be directed to sign into Facebook or an email will be sent to the email address you entered.

As a feed owner you will be notified 5 days before tokens are scheduled to expire, and again the day they expire, if you have not already reauthorized.
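The reminder schedule above is simple date arithmetic: a token granted today expires 60 days out, with notices at expiry minus five days and on the expiry day itself. A sketch of that calculation (function and parameter names are ours, for illustration):

```python
from datetime import date, timedelta

def notification_dates(granted: date, lifetime_days: int = 60) -> tuple[date, date]:
    """Return the two reminder dates for a token granted on `granted`:
    five days before expiry, and the expiry day itself."""
    expires = granted + timedelta(days=lifetime_days)
    return expires - timedelta(days=5), expires

notification_dates(date(2012, 5, 1))
# -> (date(2012, 6, 25), date(2012, 6, 30))
```

Reauthorizing at any point restarts the 60-day clock, so acting on the first reminder avoids any gap in crawling.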

While this may be seen as a nuisance, we understand that Facebook is taking this step to improve user security, and we will do our best to ensure the process is as smooth as possible.

We are excited to announce new options for crawling Facebook. We have added the ability to crawl profiles anonymously, crawl friends’ profiles, and crawl age and country restricted pages. Also, in an effort to make crawling Facebook easier we have combined page and profile crawl creation.

We now offer the option to crawl a Facebook profile anonymously. This captures the profile information that is available to all Facebook users. Simply enter the address of the profile you want crawled, select the “Anonymous User” option for who to crawl as, and complete the form; we will begin crawling. If you need to capture more information than is available publicly, and are unable to get the user’s credentials, we have added an option for that as well.

You now have the ability to crawl friends’ and other profiles as they are available to you, or to anyone else who is willing to authorize the use of their credentials. Enter the address of the profile you want to crawl, select either the “Enter Facebook credentials” or “Email other Facebook User for credentials” option, and proceed with authentication. We will then crawl the specified profile as if we were the user who authorized the crawl.

Facebook fan pages can be restricted based on the age and country of the viewing user. You can now crawl these restricted pages by selecting to crawl the feed with Facebook credentials that have the proper permissions to view these pages. When creating the feed, select either the “Enter Facebook credentials” or “Email other Facebook User for credentials” option and proceed with authentication. Once authentication is completed we will be able to crawl the page as if we were viewing the page as the authenticated user.

In addition, you no longer need to know whether you are crawling a Facebook fan page or a Facebook profile. You can create crawls for both using the same process. Select the “Facebook Feed” option from the “Add a New Website or Social Network Account” dropdown menu. From there, enter the ID of the Facebook object you want to crawl, select who you want to crawl the object as (more on this later), then fill out the remainder of the options just as before. After authorization, if it is required, we will start crawling Facebook.

We hope the streamlining of the Facebook Feed setup will save you time when setting up new feeds. And we hope these new features will allow you to capture all the Facebook information you need.

Last night we rolled out some improvements to help users switch between different accounts and product instances in Trial Cloud, Discovery Cloud, and Cloud Preservation. With more and more customers taking advantage of all of the Nextpoint applications, as well as the introduction of Nextpoint’s WIRE technology, we know that the list of product instances a user may have could get unruly.

So, to help keep that organized, we’ve updated the change instance drop down (screenshot below) to include only your product instances for the current account. For those of you who have access to more than one account, we’ve provided a link right next to the account name that allows you to switch accounts. And finally, for account administrators, we’ve moved the account administration link into this switch instance drop down (it was previously in the drop down that appears when you click on your user name in the right-hand corner).

Switch repositories (click to enlarge)

We’re hoping these changes help keep you organized as the number of your Nextpoint product instances grows.

The account dashboard is your tool for keeping up to date on how much data you’re storing in your Trial Cloud, Discovery Cloud, and Cloud Preservation repositories. Each product dashboard provides an overview of the data used by each of your repositories as well as a product-wide gigabyte sum.

The numbers shown for each repository are the averages of all the records for the time period you are viewing. We run our storage calculations twice daily – once in the morning and once in the evening. You can view a repository’s daily usage by clicking on the repository name. The daily usage records shown are the maximum of the two storage numbers for that day in gigabytes.
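Put another way: each day’s record is the larger of the two samples, and the dashboard figure is the average of the daily records in the period you’re viewing. A small sketch of that arithmetic (function names are ours, for illustration):

```python
def daily_usage(samples: list[float]) -> float:
    """A day's recorded usage (in GB) is the maximum of the
    morning and evening storage readings."""
    return max(samples)

def period_average(daily_records: list[float]) -> float:
    """The dashboard figure for a repository is the average of
    the daily records over the period in view."""
    return sum(daily_records) / len(daily_records)

# Three days of (morning, evening) readings:
days = [daily_usage([10.2, 10.6]),
        daily_usage([10.6, 10.4]),
        daily_usage([10.4, 10.4])]
period_average(days)
```

Using the daily maximum means short-lived dips between samples never understate what you were actually storing that day.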

The Cloud Preservation dashboard includes feed counts as well as storage numbers and presents these in the same fashion.

A note on document deletion: we wait a full day after a document has been deleted before fully purging it from the system. This gives us the ability to restore the document quickly if it was deleted by mistake. It may cause some lag in the reduction of gigabytes used per day, but have no fear: the reduction will be recorded.

Managing storage can be a daunting task and we strive to be transparent about the amount of data you are storing in any of our products.