Saturday, March 30, 2013

I am reading the excellent Wrox "Professional Oracle WebLogic Server"; it recommends always assigning all services (Proxy Services in OSB) a Work Manager with a Max Threads Constraint equal to the Maximum Capacity of the connection pool of any data source they use. So far so good.

What I didn't know is that in older (pre-12) versions of WebLogic, if you give the Max Threads Constraint the same name as the data source, the maximum number of threads will adjust automatically to match the Maximum Capacity of the data source. This is cool.
In WebLogic 12, you can do this explicitly in the console:
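The same setup can also be sketched directly in config.xml. This is only an illustration - all the names here (data source, server, Work Manager) are hypothetical, and you should verify the exact element names against the MaxThreadsConstraint MBean documentation for your WebLogic version:

```xml
<self-tuning>
  <!-- the constraint tracks the Maximum Capacity of the named connection pool -->
  <max-threads-constraint>
    <name>MyDataSource</name>
    <target>osb_server1</target>
    <connection-pool-name>MyDataSource</connection-pool-name>
  </max-threads-constraint>
  <!-- the Work Manager referencing the constraint, to be assigned to the Proxy Service -->
  <work-manager>
    <name>MyDSWorkManager</name>
    <target>osb_server1</target>
    <max-threads-constraint>MyDataSource</max-threads-constraint>
  </work-manager>
</self-tuning>
```

The point of tying the constraint to the pool rather than to a hard-coded count is that the thread ceiling then follows the pool's Maximum Capacity if you resize it later.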

Sunday, March 24, 2013

Extraordinarily well written and witty, this book reveals how bad sanitation (= the lack of proper toilets) is the leading cause of child death in the world, being the root cause of diarrhoea, which kills far more than AIDS, cholera and malaria combined.

The author has traveled the world investigating the cultural and political aspects of sanitation, and addresses this rarely mentioned topic with great technical knowledge.

Silent Spring is an old book but still very relevant, on the topic of the poisoning of all creatures by human waste - mainly pesticides, herbicides (= biocides) and fertilizers. A must-read for all ecologists.

Java compiler level does not match the version of the installed Java project facet.
Faceted Project Problem (Java Version Mismatch)

Don't panic: it's normal with Maven that things take 10 times longer than they should and require a lot of manual hacks and desperate googling.

Just right-click on the project in Eclipse, open Properties > Project Facets,
and set the Java facet to 1.6 (I assume your workspace default is 1.6). For some strange reason Maven defaults to 1.4. My grandmother - deceased in 2009 at the age of 97 - used to code in 1.6, so I guess that Maven authors are older than her - maybe they are Trilobites, who ruled the Earth some 500 million years ago. Welcome to the 21st century, pals.
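Fixing the facet by hand works, but the next "Maven > Update Project" may drag you back to 1.4. A more durable fix is pinning the compiler level in the pom.xml - a minimal sketch:

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
    </plugin>
  </plugins>
</build>
```

With this in place, regenerating the Eclipse project settings from the POM keeps the 1.6 facet.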

"HTTP servers sometimes use compression (gzip) or deflate methods to optimize transmission. How both chunked and gzip encoding interact is dictated by the two-staged encoding of HTTP: first the content stream is encoded as (Content-Encoding: gzip), after which the resulting byte stream is encoded for transfer using another encoder (Transfer-Encoding: chunked). This means that in case both compression and chunked encoding are enabled, the chunk encoding itself is not compressed, and the data in each chunk should not be compressed individually. The remote endpoint can decode the incoming stream by first decoding it with the Transfer-Encoding, followed by the specified Content-Encoding."

How can this "chunked" happen?

Is there any HTTP Business Service along the path?
If yes, we should check if it has got Chunked Streaming Mode disabled:

Oracle documentation states that the Chunked Streaming Mode property should be selected "if you want to use HTTP chunked transfer encoding to send messages." You normally want to enable chunked streaming if possible (with my problem above, it is not possible).

Chunked transfer encoding is an HTTP 1.1 specification, and allows clients to parse dynamic data immediately after the first chunk is read. Note that the Oracle documentation also states not to enable chunked streaming if you use the Follow HTTP Redirects option, as redirection and authentication cannot be handled automatically in chunked mode.
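The two-staged encoding described above can be sketched in a few lines of Python. This is a toy illustration of the ordering (gzip first, chunking second, and the receiver undoing them in reverse), not OSB's actual implementation - the chunk framing is hand-rolled for clarity:

```python
import gzip

def chunked_encode(data: bytes, chunk_size: int = 8) -> bytes:
    """Wrap an already-encoded byte stream in HTTP/1.1 chunked framing."""
    out = b""
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        # each chunk: hex size, CRLF, raw bytes, CRLF
        out += f"{len(chunk):X}\r\n".encode() + chunk + b"\r\n"
    return out + b"0\r\n\r\n"  # zero-length terminating chunk

def chunked_decode(stream: bytes) -> bytes:
    """First decoding stage on the receiver: undo the Transfer-Encoding."""
    out, pos = b"", 0
    while True:
        eol = stream.index(b"\r\n", pos)
        size = int(stream[pos:eol], 16)
        if size == 0:
            return out
        out += stream[eol + 2:eol + 2 + size]
        pos = eol + 2 + size + 2  # skip chunk data plus its trailing CRLF

body = b"Hello, chunked and gzipped HTTP world!"
# Stage 1: Content-Encoding: gzip (applied to the whole entity body)
compressed = gzip.compress(body)
# Stage 2: Transfer-Encoding: chunked (applied to the already-compressed bytes)
wire = chunked_encode(compressed)
# The receiver reverses the stages: de-chunk first, then gunzip
assert gzip.decompress(chunked_decode(wire)) == body
```

Note that the chunk sizes bear no relation to the gzip structure: the chunking is pure transport framing laid over an opaque compressed stream.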

Saturday, March 9, 2013

The first is my favourite book, the first book I would grab if I were sent into exile on a desert island. It's more a meditation on human nature than a detailed account of the Buna-Monowitz work camp: http://en.wikipedia.org/wiki/If_This_Is_a_Man (Ist das ein Mensch, Se questo è un uomo) by Primo Levi,

probably one of the most intelligent men of his generation, a chemist and a man of letters. They say he committed suicide in 1987 but it's bullshit: he was under medication, lost his balance and fell. The companion book "The Truce" is also well worth reading, telling how, after the liberation by the Red Army, he came back from Russia to Italy basically on foot.

Another book I am reading is From the Ashes of Sobibor, by Thomas Blatt; the literary style is very simple but it really captures you. From it you learn that most Ukrainian and Polish people - mostly Catholic - were very happy to denounce and rob Jews - the exceptions were very rare.

Then there is Five Chimneys, a sober and detailed account of a woman's struggle for survival in Birkenau. Maybe not as good as the previous book, because it focuses exclusively on the details of the concentration camp, leaving aside any political analysis, but still a very valuable and sobering source of information.

With this trigger, any modification will generate an extra entry in the PVLOG table.

Now I want to distinguish the operation being done:

CREATE OR REPLACE TRIGGER PVTEST_TRIGGER
BEFORE DELETE OR INSERT OR UPDATE ON PVTEST
FOR EACH ROW
BEGIN
  IF DELETING THEN
    INSERT INTO PVLOG (PVSUMMARY) VALUES ('DELETING');
  END IF;
  IF INSERTING THEN
    INSERT INTO PVLOG (PVSUMMARY) VALUES ('INSERTING');
  END IF;
  IF UPDATING THEN
    INSERT INTO PVLOG (PVSUMMARY) VALUES ('UPDATING');
  END IF;
END;
/

In the case of an UPDATE, you have the two pseudorecords :OLD and :NEW pointing to the old and new row.
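For instance, assuming a hypothetical VARCHAR2 column NAME on PVTEST (not defined above - adapt to your actual table), an UPDATE trigger can log both the old and the new value:

```sql
-- assumes PVTEST has a VARCHAR2 column NAME (hypothetical)
CREATE OR REPLACE TRIGGER PVTEST_UPDATE_TRIGGER
BEFORE UPDATE ON PVTEST
FOR EACH ROW
BEGIN
  INSERT INTO PVLOG (PVSUMMARY)
  VALUES ('UPDATING: ' || :OLD.NAME || ' -> ' || :NEW.NAME);
END;
/
```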

The real pity is that there doesn't seem to be a way to declare a trigger for ANY table in the schema, retrieving the table being affected through a :TABLE variable. You can define SCHEMA TRIGGERS, but they cannot be made to catch INSERT, UPDATE or DELETE (DML) events - they only fire on DDL and database events.
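What a schema trigger can do is catch DDL events - a sketch, reusing the PVLOG table from above just for illustration (ora_sysevent and ora_dict_obj_name are Oracle-supplied event attribute functions):

```sql
-- fires on any DDL statement in the schema, but never on DML
CREATE OR REPLACE TRIGGER LOG_DDL_TRIGGER
AFTER DDL ON SCHEMA
BEGIN
  INSERT INTO PVLOG (PVSUMMARY)
  VALUES (ora_sysevent || ' on ' || ora_dict_obj_name);
END;
/
```

So a CREATE TABLE or DROP INDEX gets logged, but an UPDATE on a table does not - for DML you still need one row-level trigger per table.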

If the config jar file was exported at the resource level, and a resource is not in the jar file but is in the domain: Skip (no-op).

If the config jar file was exported at the project level (this behavior affects only the resources in the projects found in the jar file), and a resource is not in the jar file but is in the domain: Delete.