So far, the workaround has been not to map these columns at all when adding the table to the datastore. BODS then generates SQL that does not assign any values to these columns, and SQL Server applies the default value.

Code:

/*1 - insert a row without the column that has the default constraint*/
insert into testDefaultValue (Col1, Col2, Action)
values (1, 'something', 'insert a row without the column that has the default constraint');

select *
from testDefaultValue as tdv;

However, we also want to be able to run an initial load, and for the full-load data flows only we would like to supply the values for these columns ourselves. Therefore we must add these columns to the datastore definition.
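What the full-load flow should generate is a statement that lists the column and supplies an explicit value; a sketch, assuming the testDefaultValue table above also has a column (call it Col3) carrying the default constraint:

```sql
/*2 - full load: supply an explicit value instead of relying on the default*/
insert into testDefaultValue (Col1, Col2, Col3, Action)
values (2, 'something', 'explicit value', 'full load supplies the value');
```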

What I need to know is how I can make BODS use the DEFAULT keyword in SQL as the value for a column:
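A minimal sketch of the statement I mean, again assuming a hypothetical Col3 with a default constraint on the testDefaultValue table above:

```sql
/*3 - use the DEFAULT keyword as the value for the constrained column*/
insert into testDefaultValue (Col1, Col2, Col3, Action)
values (3, 'something', default, 'use the DEFAULT keyword as the value');
```

SQL Server then substitutes the column's default, exactly as if the column had been left out of the column list.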

Hi - I'm not aware of a way to make DS behave like the 3rd query. SQL Server identity and default columns are a hassle to work with in Data Services. Typically I bring the table in as-is and create separate branches in my DF for insert and update. For example, if there is an identity column, I do not pass that column to the target table on the insert branch.
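To illustrate the branch approach (hypothetical table and column names; the identity column here is called Id), the SQL effectively generated by the two branches looks like this:

```sql
/* insert branch: Id is left unmapped, so it never appears in the
   column list and SQL Server assigns the next identity value */
insert into TargetTable (Col1, Col2)
values (?, ?);

/* update branch: Id is mapped only as the key, never as a set column */
update TargetTable
set Col1 = ?, Col2 = ?
where Id = ?;
```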

What objects are you working with in Data Services to populate the target table?

Source is a table object (a real Oracle table acting as a stage).
Target is another table object, also a real Oracle table, but with default values defined for some of its columns. During the ETL, these columns are supposed to receive their default values. The target table also has an auto-increment field that acts as a surrogate primary key.
So the DataFlow looks like this: