If you have a look at the execution plan and the profile, you see why the two functions use different execution plans!

The query processor replaces an inline UDF reference with its definition; in other words, the query processor "expands" the UDF definition and generates an execution plan that accesses the underlying objects and can use their statistics!

So - from my point of view - multi-statement functions may be harmful :)
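To make the contrast concrete, here is a sketch of the two forms side by side. The table and column names (`dbo.Orders` etc.) are illustrative, not from AdventureWorks:

```sql
-- Inline TVF: a single RETURN (SELECT ...). The optimizer expands it
-- like a view and can use column statistics on dbo.Orders.
CREATE FUNCTION dbo.OrdersByCustomer_Inline (@CustomerID int)
RETURNS TABLE
AS
RETURN
(
    SELECT OrderID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID
);
GO

-- Multi-statement TVF: the body fills a table variable. The optimizer
-- sees only an opaque call, with no statistics on the result.
CREATE FUNCTION dbo.OrdersByCustomer_Multi (@CustomerID int)
RETURNS @result TABLE (OrderID int, OrderDate date, TotalDue money)
AS
BEGIN
    INSERT INTO @result
    SELECT OrderID, OrderDate, TotalDue
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID;
    RETURN;
END;
GO
```

Both return the same rows, but only the inline form is "expanded" into the calling query as described above.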

>> Can you program the following AdventureWorks2012 table-valued function better? <<

The DDL stinks and is full of ISO-11179 violations. Heck, nobody even got the word about Pascal and camelCase! For example, the USPS allows 35 characters for an address, not the fifty characters from an old ACCESS and BASIC default value. Nobody would use an integer for a tag number like that silly generic "duns" (what role does it have in the data model? Can it be a lawful person? Etc.?). AdventureWorks was meant to be a demo for every possible feature in T-SQL they could load into it.

Unfortunately, they had no idea how to do a valid data model, so it sucks. Instead of “Personnel”, the name of the abstract set, they used the singular noun “Employee”, “Person” is both a database and a table, etc.

There is no such thing as a "type_id"; an attribute property can be a "<something>_id" or a "<something>_type", not a monster hybrid. What is your "blood_type_id_value_code"? See how silly extra attribute properties are? And since there are not many types in this nominal scale, it ought to be a CHECK() constraint, not a table.
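A sketch of that point, with hypothetical column names and values standing in for the AdventureWorks lookup table:

```sql
-- Instead of a separate lookup table keyed by a "type_id", a short
-- nominal scale can live directly in a CHECK() constraint.
CREATE TABLE Personnel_Phones
(
    duns        CHAR(9)     NOT NULL,
    phone_nbr   VARCHAR(15) NOT NULL,
    phone_type  VARCHAR(10) NOT NULL
        CHECK (phone_type IN ('cell', 'home', 'work')),
    PRIMARY KEY (duns, phone_type)
);
```

The constraint documents the domain in the table itself and removes a join against a two-column lookup table.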

The DUNS is how a business is identified; apparently this is also how they identify their personnel. Otherwise, we have Kabbalah numbers and magic.

The most portable way would be to create an "Automobiles, Squids and Lady Gaga" VIEW from the fragmented universes (every schema is supposed to be its own little universe of discourse) in this non-data model. A very rough skeleton would be simple: UNION ALL the contact attributes from tables that have been corrected to model entities, instead of a forced OO concept of data with Person as a super-type.
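A very rough skeleton of that UNION ALL view might look like this; all the base tables and columns here are hypothetical, standing in for the corrected entity tables:

```sql
-- One portable view over the fragmented schemas; each branch pulls the
-- contact attributes from one corrected entity table.
CREATE VIEW Contact_List (contact_name, phone_nbr, contact_type)
AS
SELECT employee_name, phone_nbr, 'personnel' FROM Personnel
UNION ALL
SELECT vendor_name,   phone_nbr, 'vendor'    FROM Vendors
UNION ALL
SELECT customer_name, phone_nbr, 'customer'  FROM Customers;
```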

This is how the original "FN-" prefixed line of code was implemented in FORTRAN II and then in 1960's BASIC! It was literally a text macro substitution precompiler command. Did you ever wonder where that "FN" came from?

I don't recall anything like that in Fortran II or Dartmouth Basic, and Google doesn't turn anything up in a quick search. Platform? References?

And I have nothing against macro substitution anyway; I wish to heaven that SQL Server had it to implement constants, though there are other ways to implement constants as well. I could think of some other large benefits to using macros in SQL as well. I'm totally for the RDBMS; I just wish it were a little better than it is.
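Two workarounds in that spirit, for what they're worth. SQLCMD mode does offer a text-substitution flavor of constants at the script level, and a one-row view is a common engine-side stand-in:

```sql
-- SQLCMD scripting variable: pure text substitution before the batch
-- is sent to the engine (runs only in SQLCMD mode / sqlcmd.exe).
:setvar MaxAddressLen 35

CREATE TABLE Addresses
(
    street_address VARCHAR($(MaxAddressLen)) NOT NULL
);
GO

-- Engine-side alternative: a one-row view of named constants.
CREATE VIEW Constants
AS
SELECT CAST(35 AS int) AS max_address_len;
```

Neither is a true macro facility, but both centralize the magic number in one place.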

All replies

What I think is that we can avoid UDFs only when we have an alternate solution or SQL statement giving the same result. Almost every programming language has the flexibility of writing user-defined functions because of the following reasons:

The short answer is "Yes" of course. Remember when people asked "can the GOTO be avoided in procedural programming?" before the Structured Programming revolution? Well, welcome to declarative or functional programming!

I don't believe UDFs give you any capability you cannot get without them.

Their main benefit is the same as that of any subroutine or function in any other language: they let you encapsulate certain logic for reuse.

Their main drawback is that Microsoft never quite figured out what they want to *do* with functions or where they fit in the SQL model. Their interference with query optimization is horrible, and I think largely unnecessary. It all centers on what is "stochastic" versus "deterministic" and what those terms mean in a SQL environment.
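For the curious, SQL Server will actually tell you which side of that line it thinks a function falls on. A minimal sketch (the function itself is a made-up example):

```sql
-- SCHEMABINDING is a prerequisite for the engine to treat a scalar
-- UDF as deterministic.
CREATE FUNCTION dbo.SquareIt (@x int)
RETURNS int
WITH SCHEMABINDING
AS
BEGIN
    RETURN @x * @x;
END;
GO

-- Ask the engine whether it proved the function deterministic.
SELECT OBJECTPROPERTY(OBJECT_ID('dbo.SquareIt'), 'IsDeterministic');
```

Determinism matters because it decides, for example, whether the function's result can be used in an indexed computed column.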

Joe's major complaint is that they are proprietary syntax and not ANSI approved. Well, that's as may be.

They are handy and pretty; I use them in limited circumstances and with advance knowledge of their drawbacks. There are lots of things in SQL and relational that are less than perfect but overall worth the bother; UDFs are just another on that pile.


No, it is that they do not optimize. If they optimized, I could live with non-ANSI as an excuse. But it is #2 :)

Hello Joe,

if it is an inline function, it will be optimized, as my example demonstrated. I'm not a fan of UDFs, and yes - most (if not ALL) set-based operations can be handled with views in the same way - but inline functions may give a benefit because they can take variables as parameters (like predicates), and the optimizer can still use the underlying objects.
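That parameter-as-predicate point is the one thing a view cannot do. A sketch, with illustrative table names:

```sql
-- A view cannot take a parameter; an inline TVF can, and after
-- expansion the parameter becomes an ordinary predicate on the
-- underlying table, so the optimizer can use its statistics.
CREATE FUNCTION dbo.RecentOrders (@cutoff date)
RETURNS TABLE
AS
RETURN
(
    SELECT OrderID, CustomerID, OrderDate
    FROM dbo.Orders
    WHERE OrderDate >= @cutoff
);
GO

-- Used like a table in a query:
SELECT c.CustomerName, r.OrderID
FROM dbo.RecentOrders('2012-01-01') AS r
JOIN dbo.Customers AS c
  ON c.CustomerID = r.CustomerID;
```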

The responsibility for the code lies with the developer! The tools are given by Microsoft, and the developer has to learn how to use them :)