Two weeks ago I had the pleasure of giving a talk about how to use Big Objects & Async SOQL to build your Big Data solution in Salesforce, with my super colleague Agustina García, at Dreamforce. You can check out the video of the talk here.

In this post I want to highlight the 10 key things to remember about Big Objects, one of the features we talked about in the presentation.

#1 Big Objects live in a different infrastructure, built to support large volumes

Relational databases start to break down at millions of records. For that reason, Salesforce created a brand new infrastructure to support Big Objects. This new infrastructure is highly scalable and built with proven big data technologies.

#2 But… they are well integrated with Salesforce

Although they live in a different place, you can still work with them using most of the tools that you are used to working with in Salesforce.

They support relationships with custom or standard objects

You can query them with SOQL (although with some limitations) and create them with Apex (using Database.insertImmediate). This means that although they are not available through the Standard UI, you can always create a Visualforce page or Lightning Component to interact with them.
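As a minimal sketch of what inserting a Big Object record from Apex looks like (the object and field names here, Customer_Interaction__b and its fields, are hypothetical examples, not from the talk):

```apex
// Hypothetical Big Object: Customer_Interaction__b, with a lookup to Account
// and a Datetime field as part of its index.
Customer_Interaction__b interaction = new Customer_Interaction__b();
interaction.Account__c = accountId;              // lookup to a standard object
interaction.Interaction_Date__c = System.now();

// insertImmediate writes straight to the Big Object store (see #6: not transactional)
Database.SaveResult result = Database.insertImmediate(interaction);
if (!result.isSuccess()) {
    System.debug('Insert failed: ' + result.getErrors());
}
```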

You can set up CRUD & FLS for them

You can create Big Object records through the Bulk and SOAP APIs

#3 They don’t count towards storage!

Yes, you read that right, they don’t. We still don’t know the price this new kind of storage will have, but we do know for sure that it will be less than the normal storage price.

#4 You will need to define an index, which will be the unique identifier for the records

From API version 40.0, defining an index is compulsory. This index can be a composite of up to 5 fields, and it will be the primary key or unique identifier for the records in the database.
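The index is declared in the Big Object’s metadata file. As an illustrative sketch (the object and field names are made up), a composite index over two fields might look like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CustomObject xmlns="http://soap.sforce.com/2006/04/metadata">
    <deploymentStatus>Deployed</deploymentStatus>
    <label>Customer Interaction</label>
    <pluralLabel>Customer Interactions</pluralLabel>
    <indexes>
        <fullName>CustomerInteractionIndex</fullName>
        <label>Customer Interaction Index</label>
        <!-- Field order here defines the order you must respect in SOQL filters -->
        <fields>
            <name>Account__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
        <fields>
            <name>Interaction_Date__c</name>
            <sortDirection>DESC</sortDirection>
        </fields>
    </indexes>
</CustomObject>
```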

The index plays an important role throughout a Big Object record’s life:

If you try to insert a record whose index values already exist in the database, the existing record will be updated instead of a new one being created. So insert really acts as an upsert when talking about Big Objects.

If you want to query Big Object records with SOQL, you will only be able to filter by / order by the fields specified in the index, in exactly the same order. Any other filtering / ordering won’t be possible. Currently, Salesforce is working hard on giving us less restrictive synchronous queries [safe harbor].

Important: Once a Big Object is deployed, you cannot change its index.
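To make the filtering restriction concrete, here is a sketch assuming a hypothetical Big Object with a composite index of (Account__c, Interaction_Date__c), in that order:

```apex
// OK: the filters use the index fields from left to right, in index order
List<Customer_Interaction__b> rows = [
    SELECT Account__c, Interaction_Date__c
    FROM Customer_Interaction__b
    WHERE Account__c = :accountId
      AND Interaction_Date__c > :since
];

// Not allowed: this skips the leading index field (Account__c),
// so the query is rejected at runtime:
// SELECT Account__c FROM Customer_Interaction__b WHERE Interaction_Date__c > :since
```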

#5 Once records are created, they cannot be deleted

For now, there is no standard way to delete them. However, you can always open a case with Salesforce support and they will help you. Fortunately, it will soon be possible to delete a record by specifying the complete primary index values [safe harbor].
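This is forward-looking, but assuming the delete API lands as Database.deleteImmediate (taking a record populated with all of its index field values), the shape could look something like this sketch, with hypothetical object and field names:

```apex
// Forward-looking sketch [safe harbor]: identify the record by supplying
// the COMPLETE set of index field values, then delete it immediately.
Customer_Interaction__b target = new Customer_Interaction__b();
target.Account__c = accountId;                  // first index field
target.Interaction_Date__c = interactionDate;   // second index field
Database.deleteImmediate(target);
```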

#6 Database operations are not transactional

When inserting / updating Big Object records, you are hitting a different database from the standard Salesforce (Oracle) one, so these operations are not transactional. This means that in situations where Salesforce, or you yourself, would normally do a rollback, Big Object database operations won’t be rolled back, unlike normal DML operations.
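A small sketch of the consequence, using made-up object names: the normal DML below is rolled back, but the Big Object write is not.

```apex
// Log_Header__c is a normal custom object; Log_Detail__b is a Big Object.
Savepoint sp = Database.setSavepoint();

insert new Log_Header__c(Name = 'Batch run');   // transactional DML

Log_Detail__b detail = new Log_Detail__b();
detail.Message__c = 'Something happened';
Database.insertImmediate(detail);               // written outside the transaction

Database.rollback(sp);                          // undoes the Log_Header__c insert,
                                                // but the Log_Detail__b record stays
```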

Platform Events also behave this way. Check out this interesting session from Andrew Fawcett about how to take advantage of this fact to build advanced logging patterns on the platform.

#7 Big Objects are packageable

The metadata type for Big Objects is the same as for Custom Objects, with the only difference being that their API name ends in __b. So you will need to select the “Custom Object” component type when packaging, and just select the Big Object that you want to package.

#8 Big Objects metadata can be propagated to / from Sandboxes

Big object metadata is copied as a part of generating or refreshing a sandbox. However, data in the big objects is not.

Similarly, big object metadata can be propagated from sandbox to production via the Metadata API or Change Sets.
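When deploying through the Metadata API, Big Objects travel under the CustomObject type. A sketch of the package.xml entry (the __b object name is a made-up example):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- package.xml: a Big Object is listed as a CustomObject member -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
    <types>
        <members>Customer_Interaction__b</members>
        <name>CustomObject</name>
    </types>
    <version>41.0</version>
</Package>
```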

#9 Search, Reports & Dashboards are not allowed for Big Objects

As Big Objects are designed to support very high volumes, these features are not available for now, and I don’t know if they ever will be, since, as you have seen, even SOQL filtering is very limited at the moment. We will need to wait and see what direction Salesforce takes.

However, there are two approaches you can take to overcome this:

Use Einstein Analytics, with which you will be able to report on Big Object records. Ask Salesforce if you want to know more about this.

Summarize the information that you need to report on with Async SOQL and move it to an intermediate custom object you can report on. This is the “Predictive Analysis” use case we presented at Dreamforce. I will soon publish a post covering this use case in detail and link it here.
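As a hedged sketch of the second approach (the API version, object names, and field mappings below are all assumptions for illustration), an Async SOQL job is created by POSTing JSON to the REST API, telling it which Big Object to read and which custom object to write the results into:

```json
POST /services/data/v41.0/async-queries/

{
  "query": "SELECT Account__c, Interaction_Date__c FROM Customer_Interaction__b",
  "operation": "insert",
  "targetObject": "Interaction_Summary__c",
  "targetFieldMap": {
    "Account__c": "Account__c",
    "Interaction_Date__c": "Last_Interaction__c"
  }
}
```

The job runs asynchronously; you poll the same endpoint for its status, and the mapped rows land in the target custom object, ready for standard Reports & Dashboards.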

#10 Not all field types are available, but there are workarounds

The only field types that you can create on a Big Object are:

Text and Long Text Area

Number

Datetime

Lookup to Standard or Custom Object

I will also soon publish a post covering these workarounds in detail and place the link here.

Finally, I would like to highlight that the information in this post is valid for the Winter ’18 release. If you are reading it some time later, please check the release notes, as some things may have changed.
