Recently I had a requirement to restore my phone to factory defaults and copy back only Contacts, Notes and Messages because of LowMemory issues with the phone. I faced a lot of issues after the iOS 7 upgrade, and the LowMemory issue was one of the major ones. The phone used to crash at least once or twice because of low memory, and every crash was logged in the diagnostic report (Settings -> General -> About -> Diagnostics & Usage -> Diagnostics & Usage Data). To fix the issue I restored the phone to factory defaults and restored only Contacts, Messages and Notes.

I recently upgraded my iPhone 4 to iOS 7.1.2. The phone was quite stable on iOS 6.0.1, but due to a mandatory requirement I performed the iOS 7 upgrade. After upgrading, I see a lot of positives and negatives. iOS 7 gives the phone a new look and feel, but I lost stability: a lot of performance issues started, such as the phone not responding to touches as if it were busy, and apps crashing from time to time.

The database was working perfectly and no major changes had been made to it. The database is in open mode (read write), yet describing data dictionary views such as DBA_TABLES gives the following error.

V$DATAFILE has a column called CREATION_TIME that holds the datafile creation date. If your database's space additions generally happen by adding datafiles, you can easily find out the database growth using this column.

This method will not be effective in the following cases.

a. If you resize an existing datafile, this method misses that growth, as the view records only the datafile creation date.
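A minimal sketch of such a growth query, grouping added datafile space by year (the 1 GB units and yearly grouping are illustrative choices, not from the original post):

```sql
-- Sketch: database growth per year, derived from datafile creation dates.
-- Only valid when growth happens by adding datafiles, not by resizing.
SELECT TO_CHAR(creation_time, 'YYYY')                  AS year,
       ROUND(SUM(bytes) / 1024 / 1024 / 1024, 2)      AS gb_added
FROM   v$datafile
GROUP  BY TO_CHAR(creation_time, 'YYYY')
ORDER  BY year;
```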

Sometimes DBAs are asked to create a series of users following the same pattern, such as xyz01 to xyzNN. Creating the users manually in such situations is tedious and error-prone; you can use a script like the one enclosed below for this purpose.
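The original script is not reproduced here, but a minimal illustrative sketch of the idea might look like the following; the user count (10), prefix (xyz) and password are placeholder assumptions:

```sql
-- Illustrative sketch (not the original script): create users xyz01..xyz10.
-- Run as a privileged user; adjust prefix, count, password and grants.
BEGIN
  FOR i IN 1 .. 10 LOOP
    EXECUTE IMMEDIATE 'CREATE USER xyz' || TO_CHAR(i, 'FM00')
                      || ' IDENTIFIED BY change_me';
    EXECUTE IMMEDIATE 'GRANT CREATE SESSION TO xyz' || TO_CHAR(i, 'FM00');
  END LOOP;
END;
/
```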

It is a common question from developers to the DBA: is there any way to find out the rows inserted or modified recently in a table that has no date column? If the table has a date column it is an easy job. If the flashback feature is enabled in your database, here is a way to find out; it works up to a certain extent.
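A minimal sketch using Flashback Version Query, assuming a hypothetical EMP table and a one-hour lookback window; how far back you can see is limited by the undo retention:

```sql
-- Sketch: rows inserted ('I') or updated ('U') in the last hour,
-- recovered from undo rather than from any date column.
SELECT versions_starttime, versions_operation, empno, ename
FROM   emp VERSIONS BETWEEN TIMESTAMP
       (SYSTIMESTAMP - INTERVAL '1' HOUR) AND SYSTIMESTAMP
WHERE  versions_operation IN ('I', 'U');
```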

Oracle Data Pump has an option called NOLOGFILE, and by default its value is N. If you set this parameter to Y, no log file is created for the export or import activity. This is a very useful option when you run an import and the oracle user doesn't have write privilege on the directory where the dump file exists; in such cases impdp throws the following error.
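A hedged example of such an import; the credentials, directory object, dump file and schema names below are all placeholders:

```shell
# Sketch: import without creating a log file, avoiding the write-privilege
# error on the dump directory.
impdp system/password directory=DATA_PUMP_DIR dumpfile=scott.dmp \
      schemas=scott nologfile=y
```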

Here is a solution to find out an Oracle user's current password by recording the password history. It helps you find out who changed the password of a user, when it was changed, for which user, and with what new password. From the recorded history you can find the previous as well as the current password of the user. This is achieved with an Oracle autonomous transaction and the profile PASSWORD_VERIFY_FUNCTION.
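A minimal sketch of the mechanism, assuming a history table PWD_HISTORY that you create yourself. The verify function receives the clear-text password, so any table like this is highly sensitive and must be locked down:

```sql
-- Sketch only: log every password change via the profile's verify function.
CREATE TABLE pwd_history (
  username   VARCHAR2(128),
  password   VARCHAR2(128),   -- stored in clear text: protect this table
  changed_on DATE
);

CREATE OR REPLACE FUNCTION verify_and_log (
  username     VARCHAR2,
  password     VARCHAR2,
  old_password VARCHAR2
) RETURN BOOLEAN IS
  PRAGMA AUTONOMOUS_TRANSACTION;  -- commit the log row independently
BEGIN
  INSERT INTO pwd_history VALUES (username, password, SYSDATE);
  COMMIT;
  RETURN TRUE;                    -- accept the new password
END;
/

-- Attach it to a profile so password changes get recorded:
-- ALTER PROFILE default LIMIT PASSWORD_VERIFY_FUNCTION verify_and_log;
```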

If a table has a large number of records and its rows are updated or deleted periodically, unused blank spaces (holes) build up in the table segments. These blank spaces are created by row deletions and will not be reused without a reorg or a reclaim. Over a period of time the unused space accumulates to a huge size; this accumulated unused space is called fragmentation. Oracle will not automatically release it as usable free space; we have to perform a reorg activity to reclaim the fragmented space.
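One rough way to spot candidates for a reorg is to compare a table's allocated size with the size its rows should need, based on optimizer statistics. This sketch assumes an 8 KB block size and freshly gathered statistics:

```sql
-- Sketch: allocated vs. expected size per table, from statistics.
-- Large gaps between the two suggest fragmentation.
SELECT table_name,
       ROUND(blocks * 8192 / 1024 / 1024, 2)          AS alloc_mb,
       ROUND(num_rows * avg_row_len / 1024 / 1024, 2) AS used_mb
FROM   user_tables
WHERE  num_rows > 0
ORDER  BY alloc_mb DESC;
```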

Table shrink is an interesting feature provided by Oracle and is very useful for database reorganizations. There are three different options available with the table shrink command. Shrinking is achieved by moving rows between Oracle blocks; to shrink a table we first need to enable row movement on it.
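The three variants can be sketched as follows on a hypothetical EMP table:

```sql
-- Prerequisite: shrink relocates rows, so row movement must be on.
ALTER TABLE emp ENABLE ROW MOVEMENT;

ALTER TABLE emp SHRINK SPACE COMPACT;  -- compact rows, keep the high-water mark
ALTER TABLE emp SHRINK SPACE;          -- compact and move the high-water mark
ALTER TABLE emp SHRINK SPACE CASCADE;  -- also shrink dependent objects (indexes)
```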

Database reorg is a major activity for Oracle DBAs. Oracle provides the table shrink feature to reclaim the space within a table after deletes. Shrink is accomplished by rearranging or moving rows between Oracle blocks. To shrink a table we need to enable the row movement feature on it.

This is an interesting and very useful feature for testing an Oracle procedure without committing the actual changes made by the procedure. If you disable commit in procedure, you will get an error when the procedure tries to commit.
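A short sketch of the session setting described above; MY_PROC is a hypothetical procedure:

```sql
-- With commits disabled for the session, a procedure that issues COMMIT
-- raises ORA-00034, and uncommitted test changes can be rolled back.
ALTER SESSION DISABLE COMMIT IN PROCEDURE;

-- EXEC my_proc;   -- fails with ORA-00034 if the procedure commits
-- ROLLBACK;       -- undo the test changes

ALTER SESSION ENABLE COMMIT IN PROCEDURE;
```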

Google AdSense has introduced EFT / wire transfer for Indian publishers. You can upgrade your payment mode to the new system with the following steps. If you have already updated the bank account once and want to change it to some other account, you can also refer to these steps.

Follow these steps to change your publisher bank account details to the new one.

This is a question, or requirement, which I got from one of my blog readers: "How do I import only the rows that are not present in the target table?" The rows already present in the table should not be deleted, and only rows that do not exist in the target table should be imported. This can be achieved by using a combination of the TABLE_EXISTS_ACTION and QUERY options.

Here I have given an example with a normal import and with a Data Pump API import.
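One way the combination described above could look with impdp; the table, key column and credentials are placeholders, and the QUERY filter here assumes the importing user can read the target table in the subquery:

```shell
# Sketch: append into the existing table, but filter out rows whose key
# already exists in the target.
impdp scott/tiger directory=DATA_PUMP_DIR dumpfile=emp.dmp tables=emp \
      table_exists_action=append \
      query='emp:"WHERE empno NOT IN (SELECT empno FROM scott.emp)"'
```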

Suppose your database is big, or the data targeted for export is too large, and you cannot accommodate the dump in a single file. Sometimes the OS will not allow you to create such big files, or the available space in one file system or directory is not enough to hold the dump file. In those cases you can use the FILESIZE parameter to split the dump file into multiple pieces and spread them across different file systems during the export itself.
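A sketch of such a split export; DIR1 and DIR2 are hypothetical directory objects pointing at different file systems, and %U is Data Pump's substitution variable that numbers the pieces:

```shell
# Sketch: split the export into 2 GB pieces spread across two directories.
expdp system/password schemas=scott filesize=2G \
      dumpfile=DIR1:scott_%U.dmp,DIR2:scott_%U.dmp \
      logfile=DIR1:scott_exp.log
```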

The CYCLE option repeats the same set of sequence values: once the sequence reaches its max value, the next value is reset to the minimum value. The CYCLE option will end up in an error if you use the sequence for primary key values, as a primary key won't allow duplicates.

By default an Oracle sequence is created with the NOCYCLE option. With NOCYCLE you get an error message when you try to fetch the next value after reaching the max value.
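A small sketch contrasting the two options; the tiny max value is chosen just to make the wrap-around easy to see:

```sql
-- Sketch: a cycling sequence over the values 1..3. After NEXTVAL
-- returns 3, the next fetch starts again at 1.
CREATE SEQUENCE demo_seq
  START WITH 1 MINVALUE 1 MAXVALUE 3
  INCREMENT BY 1 CYCLE CACHE 2;

-- With NOCYCLE instead, the fourth NEXTVAL raises
-- ORA-08004: sequence DEMO_SEQ.NEXTVAL exceeds MAXVALUE.
```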

This parameter is used to identify a trace file easily in the trace directory. When you set this parameter, its value becomes part of the trace file name, so the file can easily be picked out of the directory. It is a session-modifiable parameter, and each time you change its value, the very next trace file reflects the new name.
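For example (the identifier value is arbitrary):

```sql
-- Sketch: tag this session's trace files with a custom identifier.
ALTER SESSION SET tracefile_identifier = 'MYTEST';
-- Subsequent trace files get names like <sid>_ora_<pid>_MYTEST.trc
```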

This parameter is useful when you want to get the approximate size of the dump file. Suppose you have only limited space available on disk and you want to check whether your dump will fit into it: you can use this parameter to estimate the size of the dump without doing the actual export. Some facts about the ESTIMATE_ONLY parameter are given here.
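A hedged example; the schema and credentials are placeholders:

```shell
# Sketch: estimate the dump size without writing any dump file.
# ESTIMATE can be BLOCKS (default) or STATISTICS.
expdp system/password schemas=scott estimate_only=y estimate=statistics
```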

There are a couple of options to find out the database name from the current session. If you are a DBA or a privileged user, you can query any of the data dictionary views, but that is not the case for a non-DBA user session.
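Both of the following sketches work from an ordinary, non-privileged session:

```sql
-- Sketch: two ways to get the database name without DBA views.
SELECT sys_context('USERENV', 'DB_NAME') AS db_name FROM dual;
SELECT ora_database_name FROM dual;   -- returns the global database name
```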

This is an undocumented parameter for Data Pump. The METRICS=Y option logs the number of objects exported or imported in each category and the time taken for each category of objects. All the details are written to the log file. See an example below.
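An invocation along these lines (names are placeholders) turns the extra detail on:

```shell
# Sketch: log per-object-type counts and timings in the export log file.
expdp system/password schemas=scott dumpfile=scott.dmp \
      directory=DATA_PUMP_DIR metrics=y
```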

Using SQL Developer 4, you can initiate a Data Pump export or import job. It has an export wizard under the menu View -> DBA. Using the wizard, you don't need to be aware of the Data Pump options to invoke a job; all the options can be set through it. Here I will show you an example of one export job.

Most of the legacy-mode options work in Data Pump through corresponding options. In legacy mode (the imp utility) we have the SHOW=Y option to spool the content of the dump file into SQL scripts without doing the actual import. The same option can be used with impdp as well, but it gets replaced by the SQLFILE option.
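The SQLFILE equivalent can be sketched like this (all names are placeholders):

```shell
# Sketch: write the DDL contained in the dump to a script instead of
# actually importing anything.
impdp system/password directory=DATA_PUMP_DIR dumpfile=scott.dmp \
      sqlfile=scott_ddl.sql
```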

In the same Oracle schema you can create two tables with the same name, one in lower case and the other in upper case. By default all DML and DDL commands apply to the upper-case table name. If you want to access the lower-case table, you have to specify the table name in double quotes (").
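A small sketch of the behaviour, using an arbitrary table name:

```sql
-- Sketch: two distinct tables whose names differ only in case.
CREATE TABLE demo   (id NUMBER);  -- stored in the dictionary as DEMO
CREATE TABLE "demo" (id NUMBER);  -- stored as lower-case demo

SELECT COUNT(*) FROM demo;        -- resolves to DEMO
SELECT COUNT(*) FROM "demo";      -- resolves to the lower-case table
```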

There are two ways, or two dynamic views, to check this: V$LICENSE and V$SESSION.

The SESSIONS_CURRENT column of V$LICENSE gives the current count of users in the database. This count does not include Oracle's own default connections; it covers only the application user connections that are live as of now.
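The two approaches can be sketched as:

```sql
-- Sketch: current user session count, two ways.
SELECT sessions_current FROM v$license;
SELECT COUNT(*) FROM v$session WHERE type = 'USER';
```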

We generally use the SQL*Plus spooling feature to log the output of SQL commands. If the output of a SQL command is too big, it is better to zip/compress the spool file. But when the available free space in the file system is too low, making the zip after creating the spool file is a bit risky: sometimes there is no room to hold both the actual file and the zip file during compression.
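One common technique for this situation (a sketch, assuming a Unix-like OS and placeholder credentials) is to spool through a named pipe, so the output is compressed on the fly and the uncompressed file never touches the disk:

```shell
# Sketch: compress spool output on the fly via a named pipe.
mkfifo /tmp/spool_pipe
gzip < /tmp/spool_pipe > output.lst.gz &
sqlplus -s scott/tiger <<'EOF'
spool /tmp/spool_pipe
select * from emp;
spool off
EOF
rm /tmp/spool_pipe
```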

Sometimes it is better to include the database/schema name and a timestamp in the spool files created from the SQL*Plus prompt. If you keep all the spool files in a single directory, this method makes it easy to pick the right spool file out of them all.

Here is a method to spool to a file named with the database name, schema name and current timestamp.
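A sketch of such a method using the SQL*Plus NEW_VALUE substitution (the .lst extension and name format are arbitrary choices):

```sql
-- Sketch: build the spool file name from the session context.
COLUMN fname NEW_VALUE fname NOPRINT
SELECT sys_context('USERENV', 'DB_NAME') || '_' || user || '_'
       || TO_CHAR(SYSDATE, 'YYYYMMDD_HH24MISS') || '.lst' AS fname
FROM   dual;

SPOOL &fname
-- ... your SQL commands ...
SPOOL OFF
```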

There are chances of mistakes in life, and the same kind of mistakes can happen in a DBA's life too. A mistake on a production database is not advisable :). But anyway, special characters can slip into a datafile name by mistake while adding the datafile to a tablespace. You can rename the datafile using the following steps.

In this example I add one datafile whose name contains a special character and a new-line character.
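The rename itself can be sketched as below; all paths are placeholders, and dynamic SQL is used so the stray new-line (CHR(10)) in the bad name can be reproduced exactly:

```sql
-- Sketch: rename a datafile whose name contains a new-line character.
ALTER TABLESPACE users OFFLINE;
-- Copy/rename the file at the OS level first, then point Oracle at it:
BEGIN
  EXECUTE IMMEDIATE
    'ALTER TABLESPACE users RENAME DATAFILE '''
    || '/u01/oradata/users_bad' || CHR(10) || '.dbf'
    || ''' TO ''/u01/oradata/users02.dbf''';
END;
/
ALTER TABLESPACE users ONLINE;
```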

Generally we use the SQL*Plus SPOOL command to record SQL command output to a file. It records the output between the SPOOL <file> and SPOOL OFF commands. Once you issue SPOOL OFF, the output file is closed to further logging.

From Oracle 10g onwards, one more option was added to the SPOOL command to append SQL command output to an existing file.
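The APPEND option can be sketched like this (file name is arbitrary):

```sql
-- Sketch: the second SPOOL adds to the file instead of overwriting it.
SPOOL results.lst
SELECT 'first run' FROM dual;
SPOOL OFF

SPOOL results.lst APPEND
SELECT 'second run' FROM dual;
SPOOL OFF
```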

A Blogger custom domain address does not work without www; it gives a "404 Not Found" error.

Solution Description:

The problem is quite critical for a web site, as the site will lose a good amount of traffic because of this error. But the solution is simple: just make a settings change in Blogger, which will fix the error :). Here is the solution.

BESCOM has recently enabled online payment for many rural places. You can use the www.bescom.org website for online payment, which is very convenient. Payment happens through BillDesk, which charges a nominal service fee as mentioned below.