I think the humanitarian sector has to invest a lot in order to produce effective innovation, because that is the price of good innovation. If that were not the case, good innovation would be easier to come by. That is why in the private sector – home to arguably the most effective innovation – you see a lot of money put towards research and development.

Basically, the higher the risk, the higher the reward. Conversely, the greater the expected results from an innovation, the higher the risk or stakes, and therefore the larger the investment betting on it.

So now the real question is: How can the humanitarian sector – where resources are limited – then pursue effective innovation?

And I think the answer to that lies in focusing on the other kind of capital: human capital. The humanitarian sector is certainly short on funds, but it is full of passionate people. So it should invest heavily in supporting teams of self-driven individuals who can produce results-driven innovations.

Last year, I presented “Esri WebGIS Platform – How we implemented ArcGIS, and you can too” at FedGIS. This year, I shared another summary – lessons and tips from that implementation. It is especially helpful if you are dealing with the federal government’s unique security responsibilities around high-value PII/PHI-based data assets and Expedited Life Cycle (XLC) processes.

From a technical perspective, I shared how we implemented a hybrid and disconnected ArcGIS design inside a 3-zone architecture with multi-VPN and multi-NIC networks on Red Hat Enterprise Linux.

From a high-level management perspective, I shared how that played out inside the federal environment.

The talks on ArcGIS Server at ESRI Health GIS were fun, but I wanted more – specifically, to install and administer its latest release on Amazon Web Services, all via the trusted command line. Here’s how I did that:

To follow along, get an EDN license and an AWS account. If you have been in the industry for long, there’s no good excuse not to have accounts with the biggest companies in GIS and da Cloud (and while you are at it, get Mapbox and CartoDB accounts too).

### Setup the stage ###

Downloaded the AWS key from //aws.amazon.com/console/ and connected to my instance (ensured it matched the min. system requirements) using its public DNS (if you restart your instance, this will change). Note: I SSHed using Cygwin instead of PuTTY.

```shell
$ ssh -i "key.pem" ec2-user@#.#.#.#.compute.amazonaws.com
$ cat /etc/redhat-release
> Red Hat Enterprise Linux Server release 7.1 (Maipo)
# Even though I used RHEL-7.0_HVM_GA-20141017-x86_64-1-Hourly2-GP2 by Red Hat
# (I later found out that ESRI provides its own AMI)
$ sudo yum upgrade
$ sudo yum update
$ sudo yum install emacs  # For that college-dorm smell, no offense Nano/Vi
$ sudo emacs ~/.bashrc
force_color_prompt=yes  # If you haven't already... ignored the embedded rant and
                        # uncommented this line to make the prompt colored, easier to read
```

### Setup the instance ###

I used an M4.LARGE instance with a 20GB EBS volume (in the same Availability Zone, of course) and ensured it wouldn't go away if I were to terminate the instance. Then I extended the partition to exceed the min. space requirements (took a snapshot first) – unfortunately, the AWS docs didn't help much with that.

```shell
$ df -h
> ...
$ lsblk  # Listed block partitions attached to the device
> NAME     SIZE TYPE MOUNTPOINT
> xvda     20G  disk
> |_xvda2  6G   part /
```

Since there was a gap in sizes between the partition and the device (and there were no other partitions), I resized the child partition "xvda2" (the root file system where I would finally install ArcGIS Server) to use up the surplus space on its parent disk "xvda".

```shell
# First, updated its metadata in the partition table
$ sudo yum install gdisk  # Since the disk label was GPT
$ sudo gdisk /dev/xvda
$ print   # Noted the start sector
$ delete
$ new
$ ####    # Used the same start sector so that data is preserved
$ \r      # For the max. last sector
$ #       # Used the same partition code
$ print
$ write
$ y
# Next, updated the actual XFS file system
$ sudo xfs_growfs /  # This is the actual change for XFS; if 'df -T' reveals the older EXT4, use 'resize2fs'
# Then, confirmed the boot sector was present so that stop-start would work
$ sudo file -s /dev/xvda  # Bootloader
# Finally, rebooted the instance to reflect the new size
$ sudo reboot
```

### Authorize, authorize, authorize! ###

Created and uploaded authorization.txt, and downloaded authorization.ecp from //my.esri.com/ -> "My Organization" -> "Licensing" -> "Secure Site Operations".

```shell
$ locate -i authorization.ecp
$ readlink -f authorization.ecp
$ ./authorizeSoftware -f /path/authorization.ecp
$ ./authorizeSoftware -s  # s = status, not silent
$ ./startserver.sh
$ netstat -lnp | grep "6080"  # Confirmed owned processes – that it was listening on the
                              # default TCP@6080 (the port is only required if you don't
                              # have the Web Adapter)
# Ensured IP and domain were listed correctly in the hosts file (e.g. a single IP may be
# mapped to multiple hosts, both IPv4 and IPv6 may be mapped to a single host, etc.)
$ hostname
$ emacs /etc/hosts
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
::1       localhost localhost.localdomain localhost6 localhost6.localdomain6
#.#.#.#   localhost localhost.localdomain localhost4 localhost4.localdomain4
```

But wait – before I could browse to my site from a public browser, I needed to add this Inbound Rule to the Security Group attached to the instance:

```
Custom TCP rule    TCP    6080    0.0.0.0/0
```

### Back to the browser ###

```
//#.#.#.#:6080/arcgis/manager/
//#.#.#.#:6080/arcgis/admin/
```

At the end, added SSL using a self-signed certificate:

```
Custom TCP rule    TCP    6443    0.0.0.0/0    # Added this rule to the Security Group on AWS first
```
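For the self-signed certificate itself, here is a minimal sketch using openssl (the CN and filenames are placeholders of my choosing; ArcGIS Server can also generate and manage its own certificate from the admin directory, so treat this as the generic openssl side only):

```shell
# Generate a throwaway self-signed certificate and key (CN is a placeholder)
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=example.internal" \
  -keyout server.key -out server.crt
# Inspect the subject to confirm what was generated
openssl x509 -in server.crt -noout -subject
```

Because the certificate is self-signed, browsers will warn on first visit; that is expected for a sandbox setup like this one.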

After years of doing this first with ESRI (PROD), then MapServer (PROD) and GeoServer (DEV), I went back to the dark, ahem, ESRI side. And what do I keep finding? That the big two are blending together in terms of looks. For example, the console of the other Java-powered mapping server, GeoServer, is looking similar to that of its big brother on steroids. The third, MapServer, somewhat paradoxically, has both come a long way (MapCache and ScribeUI, yay!) and still lost ground.

Next up, testing Tippecanoe.

PS:
* I tried both 10.3.1 and 10.0 on Ubuntu (15.04), which is unsupported. While both installed, site creation didn’t work because of missing packages – searching through apt-cache didn’t help either. On Windows, there is always their CloudBuilder.

Clearly, there’s no shortage of health data or technologies, especially following the ACA’s requirements for uniform data collection standards – just a continuing kerfuffle with overlaying disparate JSON/OGC tiles from their many data owners and manifold service endpoints. Unfortunately, only part of this problem is technical. Take flu mapping, for instance: CDC, WHO, WebMD (with Mapbox) and Google do it; even Walgreens does. Or take HIV mapping, where you can choose from CDC and NMQF, among others. Even anonymized private claims data is available for a couple of Ks a month. I think a bigger part of the problem is the misalignment between vendors’ business interests, the mandates of various agencies, and the goals of the health research community at large.

Connect

At some point, researchers and epidemiologists will want to see how these data tiles correlate with each other. And GIS professionals will want a quicker way to ‘overlay this layer’ without having to dig through Firebug. And compress it over the wire, while you are at it (when our users in remote Africa were asked to switch off their smartphones to view desktop maps, we came to understand data compression a little differently).
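On that compression point, a toy illustration (the payload below is fabricated, not a real tile) of how much a repetitive GeoJSON-style payload shrinks under gzip – the same effect you would want the server to deliver over a constrained link:

```shell
# Fabricate a repetitive GeoJSON-ish payload of 200 features
for i in $(seq 1 200); do
  printf '{"type":"Feature","properties":{"cases":%d}}\n' "$i"
done > tile.json
# Compress a copy and compare sizes in bytes
gzip -c tile.json > tile.json.gz
wc -c < tile.json
wc -c < tile.json.gz
```

Structured, repetitive payloads like tile JSON typically compress by a large factor, which is why serving them uncompressed to bandwidth-starved clients hurts so much.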

Crunch

And then they would want to analyze them, be it on the server with big data or in the client with smaller ones. On analyses, your favorite GIS continues to take heat from tools like Tableau among conference attendees.

Overall, I saw a growing use of ArcGIS Server’s publisher functionalities and a compelling body of story map templates leveraging its narrative text capabilities, e.g. the Atlas for Geographic Variation within Medicare. On publishing, I suspect some researchers would like to see a Mapbox plugin for QGIS. Yes, you can render and upload maps from TileMill to your Mapbox account, but CartoDB has QgisCartoDB, where you can view, create, edit or delete data from QGIS to your CartoDB account (I needn’t add that Python-powered QGIS remains a favorite among matplotlib-loving researchers).

++ While log analyses attest that mono-themed web maps provide a better user experience, given the nature of health data and the costs behind spinning off another mapp (yup, blended words to make a portmanteau), sometimes you just have to combine themes.

Such a planning methodology of data collection and projection does have some intrinsic faults: it relies heavily on knowledge-based skills, and it assumes that ‘correct solutions’ to social problems can be obtained from a scientific analysis of various data. It must be noted that a solution-driven approach, with its heavy reliance on the physical sciences as opposed to the social sciences, is inherently inaccurate, since the ‘best planning answer’ is a non-existent variable, changing with time, society, culture, resource availability, etc. And there is always a danger of being consumed by this technique, and of confusing the result for a solution.

The nature of this study involved making some basic assumptions about the way our study-area could evolve in the not-so-distant future. There have been doubts raised about the correctness of such a clinical technique wherein an urban settlement is ‘stripped’ of its various attributes, and these attributes then individually graded. Appreciation of the intricate complexity of human society, where each individual is a separate factor, is absent. Lack of importance to these inter-relationships is a flaw of such an analysis.

For example, in the current study, if we were to discover one other attribute, say a desert, how would it affect the final map? Using this approach, we would simply grade each cell one more time, add this new map to our list of maps, and calculate the new final map. However, we would fail to evaluate how the addition of a desert affects each of the other attributes individually.

But this flaw may not be as aggravated as it seems. Each cell gains its final value from all attributes. If in a hypothetical case, one could gather a ‘complete list of attributes’ that would impact future growth, and assign them ‘correct values’ (without even breaking them into distance-bands which are only for convenience), finally adding them in the ‘right equation’, one would come up with a case-specific fairly accurate growth forecast (however, even then, any sudden future changes would still get missed).
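The per-cell summation described above can be sketched as a toy weighted overlay (the grid values and integer weights below are invented purely for illustration; the study’s actual attributes and weights differ):

```shell
# Two 2x2 attribute maps, one grade per cell
printf '3 2\n1 4\n' > attr_a.txt
printf '1 1\n2 0\n' > attr_b.txt
# Final map = 2*a + 1*b per cell (integer weights, as in the study's approach)
paste attr_a.txt attr_b.txt | awk '{ print 2*$1 + 1*$3, 2*$2 + 1*$4 }'
# prints:
# 7 5
# 4 8
```

Adding a new attribute (the hypothetical desert) would just mean another file and another weighted term in the awk expression, which is exactly why the method cannot capture how that attribute interacts with the others.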

There have also been some other approximations:

* The integer weights assigned to attributes.

* Or, areas outside the study-area that exert significant impact on urban growth, but were ignored because of study limitations.

* Also, on examining the Cultural Points table, it is found that Cemetery was included as a row category. Cultural Points have been considered as having positive influence on future growth. But a cemetery would not have an entirely positive influence on urban growth. Furthermore, parts of UVA were used as cultural points. The university was also used as a major employer. Thus, there has been some overlapping. This results in disproportionate values for some cells.

But this study is an illustration more of a proactive planning approach than an accurate projection of urban growth for an area. And even though limited in its effectiveness, any attempt to administer planning remedies would have to include some such non-arbitrary problem-solving technique.

The National Planning Conference starts this week in Chicago at the Hyatt Regency. The theme of APA 2013 is “Plan Big”, and focuses on creating a big shift in planning via big opportunities and big projects to better plan for our future (See related questions about Big Poverty on page 7). While that theme may seem at odds with the current sequestration and other economic cuts, it is a call to get bold with the planning challenges of our times. You can follow the conference on Twitter via the hashtag #apa2013. You can also follow our division’s activities at that conference via the hashtag #apa2013tech.

The division’s Business Meeting and Facilitated Discussion will be held on Sunday at 10.30 am – 11.45 am in the Riverside Exhibition Center. The agenda for our Business Meeting includes sharing of our last Performance Report, adoption of our Bylaws, and transition to the new leadership. Our Facilitated Discussion will focus on broadband infrastructure, policy and sustainability. We have also tentatively planned a social event on Monday at 5 pm. Don’t forget to check the roster on page 4 for the other technology-related events being held at the conference – I will be discussing mapping mashups on Monday at 2:30 pm – 3:45 pm.

Finally, I want to congratulate those who got elected and thank everyone who participated in our elections held last month. Please join me in welcoming the new leadership: Katherine McMahon, incoming Chair; Nader Afzalan (@NaderAfzalan), incoming Vice-Chair; and Karen Quinn Fung (@counti8), incoming Secretary/Treasurer. Kate has some interesting ideas on planning strategies for broadband, and I wish her success in defining the division’s focus on broadband planning activities (see the notice on page 3 for a related survey at http://www.surveymonkey.com/s/72GN8KZ).

I also want to thank Corey Proctor and Joni Graves for their contributions of time and talent as the interim Vice-Chair and Secretary/Treasurer respectively, and especially Steve Chiaramonte for a splendid job as the Newsletter Editor.

I hope to meet you in Chicago. As always, you can contact me directly at harshATgisblog.org (@GISblog).

This report provides a description of existing services, both external and in-house, available to APA divisions for hosting and broadcasting webcasts to their members and other interested professionals, and specifically looks at the external Planning Webcast series. In addition, it includes an analysis of options for expanding these services. The report was produced in response to a request from the APA Divisions Council (DC).

So it seems that some companies place more importance on making the “right” decision than on executing the better follow-through. I suspect that, more often than not, you can make a “wrong” decision “right” in its follow-through. Doing so, along with setting up a few arbitrary constraints, lessens the often debilitating and sometimes paralyzing effects of decision-making amid an over-abundance of choices, no matter how much you break the process down. And it increases the chances of a “command presence”. It also opens up the possibility of more than one decision being right (or wrong, but don’t think about that too much because you’d never know until later anyway), especially since information relevant to decision-making is almost always still trickling in. Unlike in STEM academia, where all relevant information can often be known via some analytical process, and the “correctness” of decision-making therefore mostly rests on how you apply those processes and put their results together, in practice you can often never fully know all that is relevant. You have to act on incomplete, imprecise, implicit data points and shifting goal posts or evolving requirements. But by making the follow-through important, you end up taking more shots. And while that decreases your batting average, it increases the probability of more home runs.

Category 4: The award for the ‘Best Paper on Technology in Planning’ goes to Omar J. Peters’ (University at Albany, SUNY) ‘Why-Fi: A Look at Information Technology as a Strategy for Urban Development’ for the outstanding paper on the use of technology in planning.

Our Award Committee comprised elected members of the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business Meeting (National Planning Conference) on Sunday, April 10th (11:45 AM – 1:00 PM) in Beacon G, Sheraton Boston Hotel. Congratulations again to our award winner!

This is a little fresh air for an old post that had been collecting cobwebs as a draft:

During this year’s Planner’s Day on Capitol Hill, I got an opportunity to interview Senator Cardin on changing federal policies that affect planning. This is an excerpt from our interview. The full interview can be found at the Division’s website.

Harsh – What are some of your main expectations from the next federal surface transportation bill?

Senator – We face three fundamental challenges with the new transportation bill –

With bridges failing, congested roadways, and transit systems strained to the limit, we need to make a major new investment in the nation’s transportation infrastructure. According to the US DOT, the average annual cost to maintain both highways and bridges at their current level for the next 20 years could reach $78.8 billion, while it would take approximately $131.7 billion per year to improve the condition of both highways and bridges. Those figures don’t include the billions more needed for our transit systems and their needed expansions. We must act to make a major new investment in a system that is under extreme stress.

Our transportation policy needs to be reoriented to the nation’s needs in the new century. We need to better integrate our various modes of transportation for handling the nation’s commercial goods. That includes freight rail, harbors, and highway trucking routes, including their interconnection to air freight facilities. Our current system for moving people to and from their work, schools, and recreation also will need to be fundamentally rethought. That will mean a much greater focus on mass transit, alternative modes of transportation, smart growth, reduction in the number of vehicle miles traveled as a policy goal, and so much more. We need a transportation policy that supports our goal of reducing our dependence on foreign oil and reduces the generation of greenhouse gases. The new surface transportation law will not accomplish all of these changes overnight, but the new bill should put us on a fundamentally different path than we have taken in the past.

We will need to explore new ways to fund our national transportation programs. Our current reliance on a static “gas tax” is already coming up short: $8 billion in the current fiscal year. If we are successful in moving more commuters out of their cars and into buses and subways, we will see those gas tax revenues decline, not increase. If we are successful in encouraging people to live where they work and to telecommute, gas tax receipts will fall even further.

Harsh – Given the bridge tragedy in Minneapolis last year and the subsequent findings of the National Transportation Safety Board, do you support in principle the National Plan for Infrastructure Investment, and also as a way to stimulate our economy in a time of financial uncertainty?

Senator – The collapse of the I-35 Bridge was a tragedy for Minnesota and for the nation. The bridge failure resulted in 13 deaths. The accident has already spurred the nation into action.

There are approximately 600,000 bridges on highways throughout the United States. About 51 percent of bridges are state owned, 47 percent are locally owned, and less than two percent are owned by the Federal government or private entities. National surveys indicate that nearly one-quarter of all these bridges are structurally deficient.

In addition to the funds provided directly for the repair of the I-35 Bridge, the Congress provided $1 billion in special funding to address our structurally deficient bridges. Of the 2,584 bridges along the Maryland State highway system, 411 (16 percent) are classified as functionally obsolete.

The American Society of Civil Engineers, the Nation, and others are calling for major infrastructure investments. I support a sustained effort to rebuild our national infrastructure. Doing so will provide an immediate stimulus to our economy and give us the network we need to restore the health of our commercial sector.

Back in the summer of 2010, as one of the million proud owners of iPhone 4, I noticed a certain setting to switch phone carrier. That setting then portended the change we will see tomorrow. But should you bite the bait? Assuming CDMA and GSM don’t matter, here’s part 1 of my guide:

There is a lot of spin around Apple’s flagship cash cow, or as we have come to know it, the iPhone, which only recently represented about 43% of Apple’s overall sales. Not all of the coverage is positive (remember Foxconn?). Apple’s growing pains also include a big lawsuit fight. But for those without a blind, searing faith in Steve Jobs, the genius patriarch, the iPhone may very well be suffocating. If true, could Jobs be repeating his original sin? And if so, should your phone follow his sin to the grave?

iOS works better than Android out of the box. To better understand the genesis of its famed usability and cool minimalism, watch Jobs’ 2005 Stanford Commencement Address. If you decide to switch, be prepared to shell out money on cool apps and media. From a quick glance, I paid around $750 over two years. To Apple. Not AT&T (that averaged around $2,400 for the same period). And remember that MP3s from Amazon – something you can’t buy on your iPhone – tend to be less expensive and redownloadable, a big plus for some. And all that precious data would cost even more to put into MobileMe, Apple’s own cloud solution, never mind the naysayers. So expect more additions to your ever-burgeoning monthly bill (Tethering, Personal Hotspot, …).

Apple still disallows Adobe Flash (and Oracle Java) on iOS. That appears to be more a business decision than a technology constraint, designed to control the sprawl of Flash-based mobile gaming websites where you could buy outside of Apple’s walled garden. How this affects HTML5 gaming websites is still unfolding, but it certainly helps the lagging QuickTime in the meantime. In any case, it goes against the customer’s best interests by taking away her choice to enjoy multimedia content in one of the industry’s most prolific formats. But Apple has you covered with the most commonly used app: the browser. Mobile Safari is hands down the best mobile browser among the platforms I tested, namely iOS, Android and Windows Mobile. For the GIS pros among you, Joben blogs about GIS apps for the iPhone. You can always find an increasing number at the App Store, like iGIS.

Jailbreaking Folsom

So you switch and finally get that toy you were waiting for. Why jailbreak it? Jailbreaking the iPhone isn’t worth the effort, even if it is legal. And even if not upgrading to the latest and greatest release (something iTunes would handle seamlessly for you, but something you can’t always do with Cydia, because Cydia often trots a step behind) is an acceptable risk, ask yourself whether your precious data is too important to jailbreak. After all, you could brick your iPhone and quite possibly leave iTunes no way to restore it. But if your phone data is not critical, ahem, then you can add some developer functionality by jailbreaking and escape the infamous iTunes bloat. Jailbreaking could also introduce your spanking-new iOS to viruses, but if you must, hop over to Cydia. If you need a copy of the old firmware during the jailbreak, grab it from here. Once you jailbreak, remember to download a file browser or explorer, like iFunBox or iPhoneBrowser. You may also want to jailbreak to install a phone firewall out of privacy concerns. After all, Apple did confess to collecting GPS data from iOS 3 and iOS 4 daily. Then again, if that is what propels you, why share your payment info with Cydia’s marketplace (just asking)?

Some quick notes on iFunBox or iPhoneBrowser – You can’t watch your uploaded pics or videos, or play your uploaded songs in their native app, even if you upload them to the folders that the iPhone looks under, say //var/mobile/Media/DCIM/100APPLE/. This is because the iPhone, much like the Android, extensively uses SQLite as its Swiss Army database, and all your uploads need to be first registered in the database, say //private/var/mobile/Media/PhotoData/Photos.sqlite which links your IMG_0001.JPG or IMG_0002.MOV. Now there are Cydia apps like iFile that help add your photos, but videos are still no go. But if you are brave enough to try, download the SQLite Manager add-on for Firefox and test your luck.
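To see why an unregistered upload stays invisible, here is a toy registration table queried with the sqlite3 CLI. The schema below is purely illustrative – it only mimics the idea of a media registry and is not Apple’s actual Photos.sqlite layout:

```shell
# Create a toy database mimicking the idea of a media registry
sqlite3 demo.sqlite "CREATE TABLE Photo (filename TEXT, registered INTEGER);"
sqlite3 demo.sqlite "INSERT INTO Photo VALUES ('IMG_0001.JPG', 1), ('IMG_0002.MOV', 0);"
# Only registered rows would surface in the native app
sqlite3 demo.sqlite "SELECT filename FROM Photo WHERE registered = 1;"
# prints IMG_0001.JPG
```

A file dropped into the media folder without a corresponding row is exactly the IMG_0002.MOV case: present on disk, absent from the database, and therefore invisible to the app.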

Category 2: The award for the ‘Best Use of Technology for Public Participation’ goes to Michael Baker Jr.‘s ‘More For 1604 Social Media Program‘ for its good use of technology to enhance public involvement and participation in planning and decision making processes.

Our Award Committee comprised elected members of the Division Leadership, namely Jennifer Evans-Cowley, Amiy Varma and yours truly. Join us at the award distribution ceremony at our Division Business Meeting (National Planning Conference) on Monday, April 12th (7 AM) in the Hilton New Orleans Trafalgar Room. Congratulations again to all our award winners!

Those investors who are rushing to their brokers for a piece of TeleNav’s IPO (TeleNav GPS Navigator needs extra cash to fight Google Map Navigation, or prep itself for a buyout), note that TeleNav (read LBS) has nothing to do with TeleAtlas of TomTom (read data). Yet.

Wolfram|Alpha is being billed as an Answer Engine for the scientifically minded, as opposed to a Search Engine: it takes your query, implied or otherwise, that critical step further by selecting, from its list of matches, the one objective description, image, etc., and laying them out in context. Not that Google never attempts definitive answers [chord], but when it does, Wolfram|Alpha [note] handily beats it with background information. START, on the other hand, is sometimes embarrassing. Note that it may not know what to do, but it does not give the wrong answer. Yet.

So Wolfram|Alpha dares to do more than say, Google or Yahoo or Microsoft, and impresses despite its alpha status.

There are inherent risks in such an approach in that it hopes our queries are frequently specific enough, which in some cases, will not be because that is how we generally are. There is also that small issue of assigning culpability to its user for a dumb query. But through consistent performance and by avoiding curation, link-fraud etc pitfalls, Wolfram|Alpha has the potential to wean away some of the Google fan-base, notwithstanding Google Squared. And by targeting the scientific community, it has the potential to emerge as a niche Answer Engine despite semantic ambiguity or crowd-sourcing.

It is a coffee-table sized hardware running Windows Vista and allowing collaborative interaction from up to 4-6 participants. The number of hand-gestures it can recognize is obviously higher than that of a standard touch-screen which can typically handle only a single tap and drag, and maybe multi-touch. On the other hand, the Surface Table can recognize multiple taps, imprecise flicks and resizes, and touch-intensity. Actually, much like a TouchSmart, it can even detect movement just above its surface. Simply put, it is like a giant iPhone.

Application

So how does it lend itself to GIS/Planning application development? Well, it is more eye-candy than useful for its cost at this point and appropriate application ideas may not come readily. If you try to recreate a similar collaborative environment with a series of Tablet PCs, TouchSmarts and Windows 7, you might just be successful. Note that it can’t be detached from its base and wall-mounted since it has a projector underneath.

The Surface Table’s biggest strength lies in enabling a collaborative environment, and therefore it is more suited to “playful infotainment”-type applications. If you develop GIS/Planning applications for the Surface Table, note this: it would be a lot of fun, but maybe not a lot of use. It also doesn’t carry any browser application (!), so you can’t simply start using your planning mash-up, and development would present its own WPF learning curve for the web-savvy. For an elegant GUI design, remember that fat, shaky fingers need big buttons. In terms of pricing, Microsoft is currently also charging for its SDK (approx. $3K): I am not sure of their pricing model, but it doesn’t seem like a smart idea if their goal is also to encourage the viral phenomenon. And although they don’t yet come pre-installed (!), a wireless card and wheels can easily be mounted to turn your Surface Table into a self-contained unit and enhance its portability.

Sync

There are already some creative applications in-use: Soldiers returning from a patrol dump their head gears onto the Surface Table, and its docking corner instantly syncs their captured data with their sync folder- no fumbling there! Special ID tags can “identify” themselves to the Surface Table, but cell phones running Windows Mobile require a download before they can sync. Selected Omni Sheraton hotels and others are currently showcasing Surface Tables.

Technology

So how does it work? Well, conventional technologies detect touch-location by interrupting:
* Infrared
* Optical Field
* Surface Acoustic Wave
This interruption happens just above the screen substrate, and its grid coordinates are then translated to a screen position. Alternatively, you can do a makeover of your current display using Dispersive Signal Technology (DST). DST integrates chemically-strengthened glass onto an existing display. It detects the bending wave within the glass radiating to the 4 corners, where it gets converted to electric signals. This approach also makes it ideal for heavy-duty use by filtering out “noise” – say, when outdoors, or think glass spills and crumbs in a snack-rich community planning meeting. Then there is Proximity Capacitive Resistance (PCR) for touch-across-surface.

Given the niche, pass it along to qualified professionals or contact me with your resume:
“The project objectives are to develop Virtual World applications to study how people acquire, organize and apply information. The ideal candidate will have a strong background in Virtual World development, and a demonstrable interest in Social/Bio Sciences and/or Communication/Media.”

This is how the old URL looked: http://www.spatiallink.org/gistools/discuss/weblogs/blogs/?title=gisp_and_aicp. Note that there were limitations to the permalink, since %year%, %day% and %category% were unknown from the old URL. Fortunately, I had only 2 categories, so this was a cinch.
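The mapping itself is mechanical; here is a sketch of deriving a slug from the old query-string URL (the target pattern /blog/&lt;slug&gt;/ is my assumption, since %year%, %day% and %category% were unknown):

```shell
old='http://www.spatiallink.org/gistools/discuss/weblogs/blogs/?title=gisp_and_aicp'
# Pull the title parameter and swap underscores for hyphens
slug=$(printf '%s' "$old" | sed 's/.*title=//; s/_/-/g')
echo "/blog/${slug}/"
# prints /blog/gisp-and-aicp/
```

With only two categories, a handful of one-off redirect rules built this way covers every old URL.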

Although I am still on the fence about GISP given its relative lack of luster, what APA has done with AICP’s CM program could give it some shine when it comes to creating a provider ecosystem.

To quickly fill you in: Last year at its Leadership Meetings, APA launched the CM program for AICP. In short, it required professional planners to continuously seek training in order to maintain their certifications, and allowed 3rd-party providers to offer that training.

For GISP, adopting a similar approach would require forsaking a fee-centric approach, letting someone like OGC bite off a bigger share, and sinking deeper into some sort of GIS accreditation, far beyond the ESRI Authorized Training Program, before the “Surveyor Usurp” (see below).

–π

Related:
* The Status of Professional Certification in GIS – Conclusion:
“GIS application areas range from engineering to computer and information sciences, geography, business, logistics, forestry, and many other academic and professional preparation fields. Because GIS professionals come from a wide variety of backgrounds and academic preparation, no one group can claim to represent all approaches and applications within the GIS community. Also, given the volatile nature of the field, and the rapid change currently underway in software development and application deployment, adequate preparation today does not guarantee competency in the future. For these reasons, an overarching program to ensure appropriate professional preparation and competency must be developed by those parties interested in safeguarding the viability of the field and the competency of those claiming professional status.

It is unlikely that voluntary certification can assure competency across the profession if most practitioners choose not to be certified or if employers don’t insist that their employees be certified. Therefore, it is essential that benefits of certification be clearly articulated. By including a wide range of professional organizations within the certification development process, and working to include the interests of all GIS professionals by developing both a reasonable core set of competencies and appropriate specialized evaluations within the certification process, all groups will benefit from certification.”

David L. Cohen, Vice-President, Comcast– ‘…on a “very limited basis” Comcast was delaying traffic in limited areas when there is heavy traffic.’ “Don’t let the rhetoric of some of the critics scare you- there is nothing wrong with network management. Every network is managed.”

• Medium Maximization: “A medium, for example, points or money, is a token people receive as the immediate reward of their effort. It has no value in and of itself, but it can be traded for a desired outcome. Experiments demonstrate that, when people are faced with options entailing different outcomes, the presence of a medium can alter what option they choose. This effect occurs because the medium presents an illusion of advantage to an otherwise not so advantageous option, an illusion of certainty to an otherwise uncertain option, or an illusion of linearity to an otherwise concave effort-outcome return relationship. This work has implications for how points influence consumer choice and how money influences human behavior.”

Planning departments, especially those of smaller cities, have long hesitated to engage their constituents through web-based mapping tools because of technological, budgetary and other constraints. Part of the reason is simply an uneasiness with Web 2.0-esque mapping technologies.

Well, these days they have less to worry about. That is, if they don’t mind piggy-backing on corporate giants.

According to the City’s Principal Planner Michael Forbes, AICP, “the planning projects map, run by Google, is an interactive list of all residential, commercial and industrial projects throughout Burbank that are being processed or have been recently approved or denied. Each project icon on the map includes information about the project and a link to its current status.”

A note of caution for the impatient GIS Planner: While a mash-up is nowadays more than a hack, most public map APIs- smart as they are- remain constrained by their ask-coordinates-get-flat-tile design when it comes to geometry-aware mapping that requires ‘queryable geometry’.
Consequently, despite the established familiarity of mash-ups, their appropriateness to enterprise GIS for large-scale custom mapping is still debated.
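That ask-coordinates-get-flat-tile design is easy to see in the standard slippy-map tile scheme, where a longitude/latitude pair maps to a pre-rendered tile index at a given zoom- no geometry survives the trip. A quick sketch of that math:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert a WGS84 lon/lat to slippy-map tile (x, y) at a zoom level."""
    n = 2 ** zoom  # tiles per axis at this zoom
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    # Web Mercator y: squashes latitude, then scales to the tile grid
    y = int((1.0 - math.log(math.tan(lat_rad) + 1 / math.cos(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

print(lonlat_to_tile(0.0, 0.0, 1))  # → (1, 1)
```

The server hands back the flat image for that (x, y, zoom)- the features drawn on it are not individually addressable, which is exactly the ‘queryable geometry’ gap.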

Then there is that question of commercial advertisements on publicly-funded maps. Note that there are ways around it: Google Maps for Enterprise, for one, allows the option to disable location-based advertising for an annual fee. The free Google Maps also requires map and custom data to be publicly-accessible. But as far as the cause of the community’s access to information is concerned, it is well-served by such mash-ups.

“What we saw was a young and passionate movement not-so-subtly showcasing their dedication for open-source as a tool by which to challenge corporate, or closed-source, IT monopolies in the geospatial domain.”

I want to underline the ‘showcasing’ part. It is important not to ignore why that is significant for contributing to opensource- which, as some would have you believe, often lacks direction and profit and is not the best use of your time. It can be summarized like so:

As far as the Open Handset Alliance SDK is concerned, in spite of how Jonathan Schwartz feels about it and the $10 million that Google is giving away in developer prizes, the SDK could become an albatross around Google’s neck, courtesy of Java.

Google appears to have also successfully convinced the opensource Mozilla Foundation to promote its own services above and before other compelling interests. This may be akin to special interest groups’ manoeuvrings on Capitol Hill, and certainly raises the question- did Google push the Foundation to go slow on mobile? Certainly, Minimo with its XUL environment and many extensions could have made for a speedier development cycle.

PS:

* Back in 2005, realizing the potential of WAP, I tested XHTML/WML/WMLScript v. HTML/JavaScript on Nokia emulators, and wondered how best to balance the 2 different development requirements. After all, you want the many more people who own a mobile but not a computer to be able to access your services.

As the Google-backed Open Handset Alliance takes shape, I have been testing dominant WAP browsers on my 2-year-old touchscreen PocketPC. This resulting post should narrow down the choices for those who follow:

 Deep Fish by Microsoft appears to be the most promising of the lot. Unfortunately, it is in a strict testing phase and no longer accepting registrations. In the meantime, you can always make do with Internet Explorer for Mobile.
 Opera, arguably the slimmest desktop browser out there, has a paid mobile version- Opera Mobile, for $24. But if you do not have a smartphone and/or do not wish to spend any money, try Opera Mini.
 The Mozilla Foundation has the amusingly named Minimo.

Opera Mobile offers tab-browsing like Minimo, and does a better job at handling pop-ups and JavaScript than Internet Explorer. And like Minimo, it offers ‘grab and drag’ navigation, thus eliminating scrollbars. Opera Mobile also offers other subtle improvements, like allowing you to change your User Agent- a must-have for those websites that recognize mobile browsers, but remain inexplicably unprepared for them. On the other hand, Minimo features XUL [try this in Firefox – chrome://browser/content/browser.xul], which has impressively found its way into the Mozilla Amazon Browser etc., and is the most customizable.
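For the curious, changing your User Agent is exactly what it sounds like- you declare a different browser string before the request goes out, and the server serves accordingly. A minimal Python sketch (the Opera Mini string is illustrative, and example.com is a placeholder):

```python
import urllib.request

# An illustrative mobile UA string; sites key off substrings like "Opera Mini"
MOBILE_UA = "Opera/8.01 (J2ME/MIDP; Opera Mini/2.0; en; U; ssr)"

# Build the request with the spoofed header; nothing is sent until urlopen()
req = urllib.request.Request(
    "http://example.com/",
    headers={"User-Agent": MOBILE_UA},
)
print(req.get_header("User-agent"))
```

The server on the other end sees only the declared string, which is why a desktop browser with a switchable User Agent can preview a site’s mobile rendering.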

Absent from all these is the Nokia Web Browser– the sometime favorite of opensource mobile development. After all, its early emulators are what helped a lot of programmers/developers gain a handle on mobile development long before Google.

Try this page to compare Ruby‘s and Python‘s language elegance side-by-side. Spoiler Warning: There is a winner!
To get you started:
Ruby – string.method [“String”.reverse or “String”.length]
Python – string[slice] or function(string) [“String”[::-1] or len(“String”)]
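Run as plain Python, the examples above look like so (the Ruby equivalents read as method calls on the string itself):

```python
s = "String"

# Python leans on slices and built-in functions...
print(s[::-1])  # slice with a negative step reverses → gnirtS
print(len(s))   # length comes from a built-in, not a method → 6

# ...though methods exist too, Ruby-style
print(s.upper())  # → STRING
```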
–π

I was a little surprised to find MapServer listed on Nessus– the network vulnerability scanner website, itself chugging along on Apache/PHP: its mention points to greater usage than earlier anticipated. So if even AGG– the Google-esque rendering backend of the 5.0 release– is not reason enough, here‘s another reason for pre-4.10.3 users to upgrade:

The installed version of MapServer is vulnerable to multiple cross-site scripting vulnerabilities and to a buffer overflow vulnerability. To exploit those flaws an attacker needs to send specially crafted requests to the mapserv CGI.

By exploiting the buffer overflow vulnerability an attacker would be able to execute code on the remote host with the privileges of the web server.

Solution:
Upgrade to MapServer 4.10.3.

Notice how their solutions are always short and sweet. Savvy programmers/developers would know of a couple of other ways to foil such automated scanning.
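At its core, a scan finding like this is little more than a version comparison against the patched release; a rough sketch of that logic (not Nessus’s actual implementation):

```python
def parse_ver(s):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in s.split("."))

def vulnerable(installed, fixed="4.10.3"):
    """Flag any version older than the patched release."""
    return parse_ver(installed) < parse_ver(fixed)

print(vulnerable("4.10.2"))  # → True
print(vulnerable("4.10.3"))  # → False
print(vulnerable("5.0.0"))   # → False
```

Which is also why hiding or mangling the version banner is one of those “other ways” to slip past an automated scanner- it starves the comparison of its input.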

On Nessus, MapServer shares company with the spatial heavy-weight Google Earth– ‘heap overflow in the KML engine [FreeBSD]‘. Given Nessus’s reputation in the enterprise class, ESRI’s ArcGIS Server and ArcIMS are both conspicuous by their absence- impossibly secure? Less likely. Less widespread and not sufficient to warrant a mention, at least in the enterprise community? Quite possible.

Choices:
 [a] iPhone […since the buzz is about it- the Paris Hilton of the technorati]
 [b] Paris Hilton […since the buzz is about her- the iPhone of the glitterati]
 [c] Geographer […since ESRI Press said so]
 [d] Programmer/Developer

Answer: If you answered [c], you have spent a lot of time around ESRI-championed web maps with 8 direction tags, a dogged insistence on not exploiting browser cache and a ridiculous north arrow on every map- never mind that so far no one has turned a browser upside down.

Related:
 [my comment]
The Coming Internet Traffic Jam: “…argument on government legislation. It is a false argument that some proponents of non-neutrality wish to spread. Surely, in this age of war-profiteers turning in record-breaking quarters, loose monopolies of mergers and bundles, debatable price gouging etc, it is a little naive to want to believe that all the companies involved will toe some good line on the other side of short-term profits for the greater common good.

If government legislation has caused long-term damage in the past, the legislation must be refined or redone and the legislators voted out- the people’s say through ‘smart legislation’ should not be silenced.”
[/my comment]
 Making Public Policy: A Nutshell
 Wanted: Proactive Policies

This GCN article titled ‘Geospatial and the elite: Old-school geographic information systems still dig deep on mapping and analyses’ points to a tortuous debate within the traditional GIS industry, and the new industry push to remodel itself as solely an “enterprise class” industry while it continues to lose ground to an increasing domestication or democratization of GIS services.

When I look back at why I chose UVA over UPenn, the cost of living in Charlottesville v. Philadelphia, not Public Ivy v. Ivy League, proved to be the determining factor given my finances. Although Charlottesville’s small-town vibe didn’t reconcile well with the “urban” in Urban Planning, and UVA did not play to my love of physical design (focusing more on the sociological aspects of planning that I, well, now believe to be closer to the core principles of planning), it was an enriching ride.

So, as some of you may be deciding which offer letter to accept this fall, here is a little advice – focus on the one you really want and everything else might just fall into place.

Good luck!

In the Planning Studio inside the Architecture School at UVA

In the Planning Studio inside the Architecture School at UPenn

PS: Compared to UPenn, UVA has smaller graduate programs and endowments. And it feeds the Washington DC metro’s job market. UPenn, on the other hand, has a stronger focus on spatial analytics and feeds the New York metropolitan region. So spare a thought for where you would like to spend, or at least start, your professional career. A note for foreign students – UVA has a good number of, for lack of a better word, “southern aristocracy” flocking to its classes, while UPenn has a larger international student population. So stay north of the Mason-Dixon line, if you have a choice.

 ‘Capacity for regional innovation is often driven by industry ‘clusters”.
 ‘Clusters also significantly enhance the ability of regional economies to build prosperity’.

First, some quick background:

Clusters– industry and region alike– have been defined as ‘broad networks of companies, suppliers, service providers, institutions and organizations in related industries that, together, bring new products or services to a market’. A cluster-based approach provides an effective planning tool for economic development in the rural countryside. Graphically, I can summarize a rural cluster like so:

The research’s findings, lessons, conclusions, recommendations and directions that I found relevant are:

 ‘Labeling a region around a single cluster or economic activity is too simplistic due to considerable co-location of clusters’
 ‘Clusters most strongly associated with higher levels of economic performance are business and financial services; IT and Telecom; and printing and publishing’.
 ‘Human capital, as measured by educational attainment, is the primary factor related to differences in income growth among counties’.

This research also underlines the importance of spatial technologies as follows:

 ‘Much of the analysis of rural America has been overly simplistic. GIS tools and advanced spatial analyses are not commonly used. It is important that greater use of these powerful approaches be applied to a wide range of issues facing rural America’.
 ‘Mapping is particularly helpful to illustrate and communicate data on clusters’.

Anyway, as I see it as a Planner, an uneasy socio-cultural issue remains unexamined, and that is…

When you take rural America, or for that matter rural Anywhere, and strip it of all its social stereotypes and negatives, you are left with something, or end up attracting something, that is far from rural- something that will jump, skip and run to the New Yorks of our world in time.

Rural Anything does not clamor for riches; it does not yearn for the hustle-and-bustle of urban life, or for its smog-filled jam-packed commute traffic, or for that neck-breaking workday; it is not awed by the many skyscrapers of the City on whom it conveniently blames all social ills; none of the multi-cultural nightlife or rebellious ways.

Rural Anything simply desires simplicity- a dog yawning in the backyard farm; a winding trail to work; free parking; quiet and quaint neighborhoods topped by the clichéd church tucked away inside the folds of its countryside; fishing expeditions on weekends; just yearning to stretch on a summery afternoon without having to worry about city-like pollutions and crimes; content only to drift and conform to its tightly-knit value-system.

It is a different “make” of people.

How then do you convince it to join the rat-race?

–π

PS:
 As I see it, Relative Rurality- a measure used in this research- helps answer the age-old question: How far would the dollar go? Roughly, the higher the Relative Rurality, the further the dollar would go.

This week I had the opportunity to listen to the Google Guys. Having earlier missed a similar opportunity for Jack Dangermond due to schedule conflicts, I made sure I was present at this seminar.

On display were the GE Enterprise solutions- Fusion, Server and Enterprise Client. With GE Enterprise, you can sign into multiple servers, grab the most accurate data from each and roll everything into one seamless experience. You may even squeeze your private globe onto a pocket-sized device and strut it out in the field. For a private domain, GE Enterprise can scale up to a healthy 250 concurrent users, or a little less than those supported by a default PostgreSQL 8.X on Windows.

One astounding statistic quoted was the vast number of users GE has accumulated over its short life- approximately 200 million; reportedly many more than Google Maps, with nearly 80% for casual uses. And a surprising number, or so we are told, fall in the 45+ age group.

Approximations aside, here’s my take:

When you try to fathom the 200 million number, you are reminded yet again how badly ESRI, Intergraph, MapInfo, Autodesk et al. missed the globe software bandwagon. And the traditional SIS companies still do not have a clear winner when it comes to 3D buildings and surface textures, despite counting 3DS Max and Maya among them. All that information is what users now expect from any cutting-edge globe software.

From the looks of it and the high-end price tag of over $100,000, Google has smelled blood- the fat inside some governments; ESRI and Intergraph can attest to that. If Google succeeds in this aggressive push, the traditional SIS companies will cede further into the background on data visualization; they are anyway planted firmly in the backseat with regards to a lot of casual uses.

So when you combine this push with GE user groups, the KML offer to OGC, KML-based searches and other enterprise solutions, you can see why some traditionalists may be feeling nervous. Add to that the general perception about Google’s speed of innovation- ‘when you use a Google product, Google will innovate faster than the traditional SIS companies to support it’.

As I see it, that growing perception should be the biggest reason for the traditional industry’s nervousness.

These steps slowly push one other piece of software- ESRI’s ArcGlobe, part of the ArcGIS 3D Analyst extension- further away from all that is important. ArcGlobe was useful in that it eventually led to E2, but ESRI had much bigger plans- it was meant to become widely adopted for 3D data mapping and visualization.

Then Google came along, and ArcGlobe and all the shabby flyby animations and painstaking multipatches in ArcScene, also part of 3D Analyst, suddenly became embarrassing.

That leads me to my prediction of the week: all this will force ESRI to either lower the inflation-adjusted cost of its pricey 3D Analyst- currently marked at $2,500- or absorb some of it into E2 or the desktop. Note that Google Earth Pro today costs a fraction of that at $400.

 Both show comparable spatial data displays and memory usage. I am pleasantly surprised by how consenting NASA, of World Wind fame, has been to all such uses, given the murky legal waters ahead when others start using this precedent to demand equal treatment.

ESRI ArcGIS Explorer: Adding content

Being true to the misplaced compulsions of most commercial companies, ESRI only lets you export your layers in E2’s markup language [*.nmf]. However, to piggy-back on the growing user community around GE, and because ESRI has no current alternative to Google SketchUp, E2 allows you to import *.kml and *.kmz files. GE, on the other hand, also imports *.gpz and *.loc GPS files in its commercial flavor.
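Since KML is plain XML, pulling placemarks out of an imported *.kml is straightforward in any language; a minimal Python sketch (the sample placemark below is made up for illustration):

```python
import xml.etree.ElementTree as ET

# KML 2.2 default namespace, in ElementTree's brace notation
KML_NS = "{http://www.opengis.net/kml/2.2}"

def placemarks(kml_text):
    """Yield (name, lon, lat) for each Placemark that carries a Point."""
    root = ET.fromstring(kml_text)
    for pm in root.iter(KML_NS + "Placemark"):
        name = pm.findtext(KML_NS + "name")
        coords = pm.findtext(f"{KML_NS}Point/{KML_NS}coordinates")
        lon, lat = map(float, coords.strip().split(",")[:2])
        yield name, lon, lat

sample = """<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Redlands</name>
    <Point><coordinates>-117.1825,34.0556,0</coordinates></Point>
  </Placemark>
</kml>"""

for name, lon, lat in placemarks(sample):
    print(name, lon, lat)
```

That openness- any text editor or script can read and write it- is a big part of why piggy-backing on the KML community is the pragmatic move here.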

So what is the bottom line? GE is better suited for consumers of spatial data, while E2 is targeted more at creators and editors. And how close does E2 come to following the “if you are late, you better be better” mantra? Not quite, but then again, it is just a beta.

Now the waiting game begins for arguably the most innovative internet company in recent times- Google (notwithstanding the acquired nature of GE and SketchUp)- to hit back after losing ground to Yahoo Maps– better driving-directions planning– and Microsoft Virtual Earth– the ability to add and save shapes, plus browser-based GE-esque 3D and street-level views.

–π

PS:

I wonder how the good folks at Arc2Earth and Shape2Earth would maintain their rates of innovation in response?

As the Secretary/Treasurer of the Technology Division of APA, I recently had the opportunity to interview Ric Stephens, our Immediate Past Editor:

Harsh: So what got you into planning and publishing/editing?
Ric: I worked as a cartographer/German language translator for USAID during college and was hired by a civil engineering firm to prepare maps during summer break.

After school, the firm offered me a job in their planning department and …voila! There are still some plat maps on file from the late 70s with elaborate compass roses for north arrows. I began helping with a local APA section newsletter out of curiosity. A quarter of a century and thousands of newsletters later, I am still interested in desktop publishing.

Harsh: Any favorite planning story that you edited?
Ric: There are three unique stories-

Ric Stephens at the Street of Dreams

For several years, I organized the ‘Dark and Stormy Planning Prose Contest’ to collect and share humorous planning stories. One of my favorites is the 2002 Winner, ‘Zone Noir’ by Michael Young who merged the feel of a 50s detective novel with current planning issues. It’s hard to imagine, but Dr. Seuss wrote a humorous poem on regulating signage for the city of La Jolla, California!

Lastly, while living in California, I received ‘The Story of Sexton Mountain Meadows‘. It revolves around the continuous removal of the ‘t’ from ‘Sexton’. I now live a few miles from this very street in Beaverton, Oregon and am a Planning Commissioner for the City. I found the listed author, but he denies writing the story and referred me to a blog author who remembers the incident, but also denies writing the story. The mystery continues to this day.

I am still collecting stories and if you have a ‘hearing from hell’, ‘purple planning prose’ or other contributions, please email a copy to ric@alphacommunity.com.

Harsh: Any thoughts on the New Media?
Ric: We are far from reaching a paperless office environment, but we are clearly moving towards digital information and communication technologies.

For planning in particular, it is an exciting time to expand GIS with numerous databases including satellite imagery. The REAL CORP 007 event will showcase some of these outstanding IT innovations. Our firm, Alpha Community Development, is developing software to link our projects with these databases. We are also developing project-specific websites and looking for new ways to provide online project management.

Harsh: Any thoughts on increasing readership for the Technology Division?
Ric: InfoTEXT contributors have provided outstanding content that is very relevant to practicing planners, agency officials, educators and students. I believe the missing element is visibility.

It would also be helpful for APA to actively promote the Divisions, and for the Divisions to have programs to promote the newsletters to planning departments, governmental agencies, universities and other institutions.

Harsh: And finally, any advice to the new editor[s] of the Technology Division?
Ric: It’s very difficult to find contributors for articles- I’m several weeks late in responding to this interview.

Having a large group of people to help gather material would be ideal. As the newsletter migrates to the web, the publication should probably adapt a monitor-friendly format and be rich in hyperlinks. I enjoyed editing InfoTEXT and am indebted to all who helped make this a memorable experience.

“… And then the strange people of Asia- the Tartars, who are such splendid horsemen; the Arabs, who travel over the deserts upon camels, and at night stop and tell stories to each other; and the Hindoos, who burn their widows and drown their children, thinking these things are pleasing to God; and the Chinese, who eat puppies and rats, and furnish all the world with tea; and the Turks, with their big turbans- what a wonderful thing it is that in one little book we may learn all about these queer [sic] people.

Perhaps I like geography the more for this reason: Uncle Ben has a great many pictures of different countries, with the people who live there; and when I am studying about a country I look over these pictures…”

One of the pleasures of my current job is the annual opportunity to interact with professionals from around the world, thanks to the International Visitor Leadership Program. During these interactions, I share with the visiting delegations how regional government works in the Virginias.

I always end my presentation on regional governance and SIS with a quick display of Google Earth, when we try to locate the remote places the delegation members come from. As can be deduced from these pictures, the members stand in rapt attention at how one private enterprise gives back to the greater common good.

In a related development, Microsoft continues to play catch-up with Google by acquiring GeoTango. However, with its “3D Internet Visualization- a truly open and web services-oriented solution”, GeoTango may just be the partner Microsoft needs for a tango.

An intriguing article that may help those interested in best meeting project expectations in a team setting. Here is my take: for rewards, it is often best if expectations are lower than the actual; for punishments, it is often best if expectations are higher than the actual; in both cases, the resulting momentum keeps pointing upward. The old adage of “under-promise, over-deliver” follows the same line.

–π

“… The probe, called the Stroop Test, presents words in block letters in the colors red, blue, green and yellow. The subject has to press a button identifying the color of the letters. The difficulty is that sometimes the word ‘Red’ is colored green. Or the word ‘Yellow’ is colored blue.

For people who are literate, reading is so deeply ingrained that it invariably takes them a little bit longer to override the automatic reading of a word like ‘Red’ and press a button that says green. This is called the Stroop effect.

Sixteen people, half highly hypnotizable and half resistant, went into Dr. Raz‘s lab after having been covertly tested for hypnotizability. The purpose of the study, they were told, was to investigate the effects of suggestion on cognitive performance. After each person underwent a hypnotic induction, Dr. Raz said:

‘Very soon you will be playing a computer game inside a brain scanner. Every time you hear my voice over the intercom, you will immediately realize that meaningless symbols are going to appear in the middle of the screen. They will feel like characters in a foreign language that you do not know, and you will not attempt to attribute any meaning to them.

This gibberish will be printed in one of four ink colors: red, blue, green or yellow. Although you will only attend to color, you will see all the scrambled signs crisply. Your job is to quickly and accurately depress the key that corresponds to the color shown. You can play this game effortlessly. As soon as the scanning noise stops, you will relax back to your regular reading self’…

In highly hypnotizables, when Dr. Raz’s instructions came over the intercom, the Stroop effect was obliterated, he said. The subjects saw English words as gibberish and named colors instantly. But for those who were resistant to hypnosis, the Stroop effect prevailed, rendering them significantly slower in naming the colors.

When the brain scans of the two groups were compared, a distinct pattern appeared. Among the hypnotizables, Dr. Raz said, the visual area of the brain that usually decodes written words did not become active. And a region in the front of the brain that usually detects conflict was similarly dampened.

“… Ten years ago this December, I wrote a memo entitled The Internet Tidal Wave which described how the internet was going to forever change the landscape of computing… Five years ago we focused our strategy on .NET making a huge bet on XML and web services… We will build our strategies around internet services and we will provide a broad set of service APIs and use them in all of our key applications… This coming ‘services wave’ will be very disruptive… This next generation of the internet is being shaped by its ‘grassroots’ adoption and popularization model, and the cost-effective ‘seamless experiences’ delivered through the intentional fusion of services, software and sometimes hardware… I’ve attached a memo from Ray which I feel sure we will look back on as being as critical as The Internet Tidal Wave memo was when it came out…”

“… This isn’t the first time of such great change: we’ve needed to reflect upon our core strategy and direction just about every five years… In 1990, there was actually a question about whether the graphical-user-interface had merit… When we reflected upon our dreams just five years later in 1995, the impetus for our new center of gravity came from the then-nascent web… In 2000, in the waning days of the dot com bubble, we yet again reflected on our strategy and refined our direction… It is now 2005, and the environment has changed yet again- this time around services…

The Landscape:

… In the US, there are more than 100MM broadband users, 190MM mobile phone subscribers, and WiFi networks blanket the urban landscape… We should’ve been leaders with all our web properties in harnessing the potential of AJAX, following our pioneering work in OWA [Outlook Web Access]. We knew search would be important, but through Google’s focus they’ve gained a tremendously strong position. RSS is the internet’s answer to the notification scenarios we’ve discussed and worked on for some time, and is filling a role as ‘the UNIX pipe of the internet’ as people use it to connect data and systems in unanticipated ways. For all its tremendous innovation and its embracing of HTML and XML, Office is not yet the source of key web data formats- surely not to the level of PDF. While we’ve led with great capabilities in Messenger and Communicator, it was Skype, not us, who made VoIP broadly popular and created a new category. We have long understood the importance of mobile messaging scenarios and have made significant investment in device software, yet only now are we surpassing the Blackberry… The same is true of Apple, which has done an enviable job integrating hardware, software and services into a seamless experience with .Mac, iPod and iTunes, but seems less focused on enabling developers to build substantial products and businesses.

… Only a few years ago I’d have pointed to the Weblog and the Wiki as significant emerging trends; by now they’re mainstream and have moved into the enterprise. Flickr and others have done innovative work around community sharing and tagging based on simple data formats and metadata. GoToMyPC and GoToMeeting are very popular low-end solutions to remote PC access and online meetings… VoIP seems on the verge of exploding- not just in Skype, but also as indicated by things such as the Asterisk soft-PBX. Innovations abound from small developers- from RAD frameworks to lightweight project management services and solutions…

Key Tenets:

… 1. The power of the advertising-supported economic model… 2. The effectiveness of a new delivery and adoption model… 3. The demand for compelling, integrated user experiences that ‘just work’…

… Platform Products and Services Division- a. Base v. Additive Experiences… b. Services Platform… c. Service/Server Synergy… d. Lightweight Development- The rapid growth of application assembly using things such as REST, JavaScript and PHP suggests that many developers gravitate toward very rapid, lightweight ways to create and compose solutions. We have always appreciated the need for lightweight development by power users in the form of products such as Access and SharePoint… e. Responsible Competition…

Business Division- a. Connected Office… Should PowerPoint directly ‘broadcast to the web’, or let the audience take notes and respond?… b. Telecom Transformation… c. Rapid Solutions- How can we utilize our extant products and our knowledge of the broad historical adoption of forms-based applications to jump-start an effort that could dramatically surpass offerings from Quickbase to Salesforce.com?…

… Complexity kills… Another simple tool I’ve used involves attracting developers to use common physical workspaces to naturally catalyze ad hoc face-time between those who need to coordinate, rather than relying solely upon meetings and streams of email and document reviews for such interaction…”

–Ray

Related:
* “Building a Better Boom: …The Internet is exciting again, and once again folks are rushing in. In some categories – like search or social networking, for example – there are scores of start-ups vying for pretty much the same market, and it’s certain that, just like last time, most of them will fail.

But regardless of all this déjà vu, we are not in a bubble. Instead we are witnessing the Web’s second coming, and it’s even got a name- ‘Web 2.0’, although exactly what that moniker stands for is the topic of debate in the technology industry. For most it signifies a new way of starting and running companies – with less capital, more focus on the customer and a far more open business model when it comes to working with others. Archetypal Web 2.0 companies include Flickr– a photo sharing site; Bloglines– a blog reading service; and MySpace– a music and social networking site…

Start-ups are leveraging nearly a decade’s worth of work on technologies that are now not only proven, but also free, or very nearly so. Open-source software can now do nearly everything that Oracle, I.B.M. and Microsoft specialized in back in the 90’s. And the cost of computing and bandwidth? You can now lease a platform that can handle millions of customers for less than $500 a month. In the 90’s, such a platform would have run tens of thousands of dollars or more a month…

Or just ask Joe Kraus– a founder of the once high-flying Excite portal. Excite ran through millions in venture capital, then tens of millions of I.P.O. money, before its spectacular demise [Mr. Kraus had left before then]. His latest start-up- JotSpot, is built on open-source software, and cost less than $200,000 to begin.

Mr. Kraus exemplifies the second reason I believe we are not in a bubble: this time, the financiers aren’t driving. Instead, the entrepreneurs and geeks – often one and the same – are. The lessons of Web 1.0 are never far from their minds, and the desire to create something cool that might foster some good in the world is often equally paramount with the desire to make money. The culture of Web 2.0 is, in fact, decidedly missionary – from the communitarian ethos of Craigslist to Google‘s informal motto- ‘don’t be evil’.

Ah, yes, Google. That brings us to the third reason we are not in a bubble: vastly improved search technologies. Recall that the demise of Web 1.0 was predicated in large part on the collapse of the Internet advertising business – people were spending millions buying billboard-like ads that, it turns out, nobody was paying attention to…”

This week Yahoo released its own take on online mapping. Its new service includes both Flash and AJAX APIs coupled with the ability to geocode.

If you think about it, sooner or later this had to happen: developers finally mustering the courage to embrace arty Macromedia Flash for distributing spatial information in a big way, like Geocentric. Actually, Google has been using Flash for a different kind of distribution for quite some time now. But this release by Yahoo, with its under-$1,000 price tag, should help Flash emerge as a more visible player in the online mapping game.

As much as some may cringe at what they see as their tax dollars being spent on a bailout, the often-omitted fact remains that many New Orleanians were not required by the National Flood Insurance Program to purchase flood insurance because they enjoyed the protection of levees. So the federal government, through the Corps of Engineers, is at least partly responsible for creating a false sense of security by failing to repair levees in a timely manner. Bear in mind that the State of California has been asked by its courts to shoulder responsibility for damages from failures of levees for which it is a sponsor. And if we did not cry “welfare state” when the federal government stepped in to bail out the airline industry after 2001, surely we can hush our moans now.

Staying with this disaster: as one watches events unfold, it becomes clear that an infuriating “hands-off” management style, prone to making excuses for ignored red flags, can only get rewarded for ideological and rhetorical reasons rather than on merit. And such a management style finds a willing bed-partner in a “let’s-eat-at-a-steakhouse-since-the-proceeds-go-towards-relief-efforts” empathy response. In itself, such a response cannot be right all the time, for it is primarily detached and “feel good”.

Having observed this breakdown in leadership, and with some benefit of experience, I cannot stress enough that planners should resist their impulse to pen a plan for every problem and should instead focus on becoming “political actors”, for one cannot write a plan that accounts for the failure to carry out the plan itself.

On another note, many of the residents of New Orleans were not required by the National Flood Insurance Program to purchase flood insurance since they were protected by levees. Although non-discriminatory exceptions can always be made, this further complicates relief efforts as it currently limits the amount of disaster assistance available through certain agencies.

On the eve of the launch of Virtual Earth, as Microsoft plays catch-up with Google‘s high rate of innovation, here’s a transcript of some tête-à-tête:

[Sometime before 2000]
Bill Gates: Now that we are in the email business with Hotmail, we need to think of ways to fatten the bottom line.
Steve Ballmer: Online marketing is the way to go, Bill! Let’s just create ahem ahem unnecessary page-views when the user logs in and put as many graphic-intensive ads on each one of them as possible.
Bill Gates: …something like that SNL skit about advertisements on MSNBC flooding the screen and blocking the anchor’s face?!
Steve Ballmer: …hehehe, something like that! Hey, it’s a free service- the user might as well pay for it through ad views. You’ve got to market these goodies aggressively!
Bill Gates: Yeah, the bottom-line is the key!

[Sometime before 2004]
Larry Page: We need to get into the email business with a Google mail. The current services aren’t up to par.
Sergey Brin: Yeah, but given our relative size we must offer something significantly superior to what the market currently offers to make any reasonable in-roads.
Larry Page: OK, let’s start with a clean slate- how do we offer a better email service?
Sergey Brin: It’s all about the user-experience. At the end, if the user likes it, she will come back for more.
Larry Page: So we don’t flood the page with pop-ups and such junk??
Sergey Brin: That’s right! Advertisements should be useful but as unobtrusive as possible.
Larry Page: Agreed, the user-experience is the key!

I have also added this post to this wiki, in case you want to expound and guide those who follow – the post itself just helps ensure the data doesn’t get spammed out too easily:

Parent document copied with permission from the original white paper at the GIS Technical Center. The objective was to add notes reflecting procedural changes brought about by the integration of CITRIX WISE Tools. The initial notes were created during a 2005 DFIRM Production.

INTRODUCTION

In August 2003, the GIS Technical Center (WVGISTC) became a Cooperating Technical Partner with the Federal Emergency Management Agency. Our mission is to create digital flood themes from paper Flood Insurance Rate Map (FIRM) and Floodway Boundary and Floodway Map (FBFM) panels and to deliver the data in specified formats with appropriate documentation. FEMA prepares Mapping Activity Statements (MAS) that outline the scope of work and deliverables for each county-based project. Final products are primarily seamless, countywide geospatial data files in the ESRI shapefile format, along with associated metadata.

According to FEMA (Michael Craghan, pers. comm.), the final vector products will have the following qualities:

1. A seamless county-wide dataset, with no gaps or overlaps
2. The lines and polygons end up in their real-world locations
3. There is no scale distortion (i.e. spatial relationships are maintained; if paper map is 1”=500’, digital version should be too).

The current Mapping Activity Statement for conversion of Jefferson and Berkeley counties specifies these deliverables:

1. Written certification that the digital base data meet the minimum standards and specifications.
2. DFIRM database and mapping files, prepared in accordance with the requirements in Guidelines and Specifications for Flood Hazard Mapping Partners (see references for citation); (S_ Base_Index, S_Fld_Haz_Ar, S_BFE, S_XS, S_FIRM_Pan).
3. Metadata files describing the DFIRM data, including all required information shown in Guidelines and Specifications for Flood Hazard Mapping Partners.
4. Printed work maps showing the 1- and 0.2-percent-annual-chance floodplain boundary delineations, regulatory floodway boundary delineations, cross sections, and BFEs at a scale of 1:100,000 or larger.
5. A Summary Report that describes and provides the results of all automated or manual QA/QC review steps taken during the preparation of the DFIRM.
6. An ESRI shape file showing points where mapping problems are discovered during the digitizing process.

The following sections describe the procedures we follow to (1) prepare the base material for digitizing, (2) digitize features, (3) perform quality control, and (4) prepare final files using ESRI Arcmap 8.x software. This document assumes the user is skilled with ESRI Arcmap 8.x GIS software and has the ability to use reference materials. For help using ESRI Arcmap consult the help files or ESRI on-line support.

DATA COLLECTION PROCEDURES

Source Material (Source Material Inspection)
In the MAS cost estimation phase it is advantageous to become familiar with the FIRM and FBFM panels that cover the geographic extent of the county. In the back of our FEMA binder, there are 3 CDs with scanned panels for 10 high priority counties. The scanned or paper FIRM and FBFM panels should be visually inspected to check for insets and other format issues that may impact the amount of time it takes to digitize and attribute. At the on-line FEMA Flood Map Store search for FEMA issued flood maps. Follow the prompts for state, county, and community. This is one way to become familiar with the number of panels in a county and also to gather information on the effective date. The effective date on-line may be compared to the effective date on the paper panels to determine if we have the newest source. This is important because FEMA may have done some digital conversion in the counties we are digitizing; in Berkeley County, for instance, 2 of the panels were available in a digital CAD format. We received the CAD files (DLG) and copied the line vectors into our Arcmap project.

Base Layer Compilation
As part of the MAS, a ‘base map’ is obtained for georeferencing the FIRM and FBFM panels in a county. The MAS states: “the base map is to be the USGS digital orthophoto 3.75-minute quarter-quadrangles (DOQQs), or other digital orthophotography that meets FEMA standards.” Currently, we use the DOQQs to georeference the panels; when it becomes available, we will use the Statewide Addressing and Mapping photography. Countywide mosaics of the DOQQs are available either from CDs in our office or from the NRCS geospatial data gateway. Before beginning panel georeferencing, gather all the base map photography to cover the geographic extent of the county. Check DOQQ tiles and the ortho mosaic, if used, for agreement with each other. Also check the individual DOQQ tiles against the quarter quadrangle index to make sure that they are NAD83 and not NAD27. Finally, check to make sure that the spatial properties (coordinate system and projection) are defined for each quarter quad.

Georeferencing
FEMA provides scanned (TIFF) images of the paper FIRMs and FBFMs. Not all counties have separate floodway panels (FBFMs).

You can download county FIRMs and FBFMs from the FEMA Map Store. For Summers and Fayette Counties WV, aerial photographs from the SAMB were reprojected on-the-fly and used as base.

“ArcMap will not project data on-the-fly if the coordinate system for the dataset has not been defined. The coordinate system for any dataset can be defined using ArcCatalog” [ESRI Help].

It is advisable to load the aerials, FIRMs and FBFMs into different Raster Catalogs for quicker refreshes. It is best to start off by georeferencing the index and then nailing each semi-transparent panel into its approximate location through corner points [“spreading in all the right directions”]. Again, it is best to concentrate around your area of interest, in this case the floodplain. It is also advisable to adjust the visible scale for the aerials for easier navigation.

Also, try to keep the clipboard empty, since on aging systems a full clipboard may cause incomplete raster refreshes. To avoid related spikes in CPU usage, you may adjust the display settings, page-file size and Task Manager priorities accordingly. Also, if you have upgraded to ArcGIS 9.1 without the patch and are having raster display problems, consult the following ESRI thread.

“In general, if your raster dataset needs to be stretched, scaled, and rotated, use a first-order transformation. If, however, the raster dataset must be bent or curved, use a second- or third-order transformation. Add enough links for the transformation order. You need a minimum of 3 links for a first-order transformation, 6 links for a second-order, and 10 links for a third-order” [ESRI Help].

Priority should be given to georeferencing individual panels over interlocking adjacent panels. Once satisfied with the adjustments and the associated RMS error, you may either update (if using a first-order transformation) or rectify (if using a higher-order transformation).
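The minimum-link counts in the ESRI Help quote above follow directly from the number of coefficients in a 2-D polynomial of each order; a quick sanity check in plain Python (not an ArcMap API):

```python
def min_links(order: int) -> int:
    """Minimum control-point links for a 2-D polynomial transformation.

    A polynomial of degree n in x and y has (n+1)(n+2)/2 coefficients
    per output coordinate, so at least that many links are needed to
    solve for them.
    """
    return (order + 1) * (order + 2) // 2

# Matches the ESRI Help figures: 3, 6, and 10 links.
assert [min_links(n) for n in (1, 2, 3)] == [3, 6, 10]
```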

Once the groundwork is done, georeferencing takes less than half an hour per panel on a machine with the following specifications:

MS Windows 2000 SP4
Dell PWS 340
Pentium 4 CPU, 1700 MHz
1.05 GB RAM

The steps taken to georeference the scanned FIRMs/FBFMs using Arcmap are:

1. Start an Arcmap project in the desired coordinate system. When using West Virginia DOQQs that will primarily be UTM NAD83 zone 17 (although Jefferson County was zone 18).
2. Add the DOQQs for the area of interest to the project.
3. Add the scanned TIFF to the project. The first panel to be georeferenced is the most difficult, because locating the correct spot on the base map photographs using the landmarks on the panel can be frustrating without a good reference system. One way to do this is to warp the panel index first—hence giving a rough estimate of panel location on the photographs. Alternatively, after warping one panel, work with adjacent panels to make landmark location easier.
4. Use “fit to display” on the georeferencing toolbar pull-down menu to move the TIFF to the current extent.
5. Use the georeferencing toolbar to create control points on the DOQQs and the scanned TIFF, using roads and other major features appearing on the FIRM.
6. It is recommended that “Auto Adjust” be checked on the georeferencing dropdown and that the layer being georeferenced is partially transparent. As control point links are added the scanned TIFF will be shifted over the DOQQs, making finding and adding additional links easier.
7. As you are adding control points, check the residual values and total RMS value in the link table. The goal is for a total RMS value of 10 or less (units are mapping units, meters). After adding as many control points as possible it is sometimes useful to remove links that have very high residual values to improve the overall RMS value of the warp. Sometimes it is not possible to get an RMS below 10.
8. Concentrate control points around areas with flood features to improve the fit of areas that will be digitized. We recommend adding at least 10 sets of control points, although in some cases we used over 20 sets to improve fit.
9. Record the total RMS value of the transformation for each panel in a spreadsheet for the county.
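For steps 7 and 9, the total RMS value reported in the link table is the root-mean-square of the individual link residuals. A minimal sketch of that bookkeeping, using hypothetical residual values rather than actual ArcMap output:

```python
import math

def total_rms(residuals):
    """Root-mean-square of per-link residuals (map units, here meters)."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical residuals for one panel's ten control-point links.
links = [4.2, 6.8, 3.1, 12.5, 5.0, 7.7, 4.9, 6.1, 8.3, 5.6]
rms = total_rms(links)

# Removing the link with the worst residual always lowers the total RMS,
# which is why dropping a high-residual link can rescue a bad warp.
trimmed = sorted(links)[:-1]
assert total_rms(trimmed) < total_rms(links)
```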

Vertical Datum Conversion [optional]
The estimate of the basic shape of the earth was inconsistent under the National Geodetic Vertical Datum [NGVD] 1929. This resulted in less accurate vertical data computations. Hence, it was decided to shift to the North American Vertical Datum [NAVD] 1988 that uses more reliable means for this estimation. Vertical Datum is required for DFIRM panels and the D_V_Datum table. Note that Vertical Datum conversion will not result in any change in flood depths.

Begin with 7.5-minute USGS quadrangles. For Summers and Fayette Counties, WV, this data was downloaded from the WV GIS Technical Center. Next, buffer your county by 2.5 miles to select all the quad corners that fall inside the buffer. Then reproject the corner points thus selected to GCS_North_American_1983 and add XY coordinates. Now you have all the latitude/longitude coordinates required for orthometric height-difference computations using the National Geodetic Survey’s VERTCON software. Alternatively, you may use the Corps of Engineers’ CORPSCON software.

In VERTCON, if you have generated an input data file for your latitude/longitude coordinates, you would typically select the ‘Free Format Type 2’ option. Else, you would simply enter individual Station Names and associated latitude/longitude coordinates. VERTCON generates an output data file for use in the following calculations [Sample Worksheet].

Once Conversion Factors for all points have been determined, calculate the Average, Range and Maximum Offset for the Conversion Factors. If the Average is less than 0.1 foot, only a “passive” Vertical Datum conversion may be applied. Typically, when the Maximum Offset is <= 0.25 feet, a single Conversion Factor can be applied. Else, stream-by-stream Conversion Factors need to be applied.
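The decision rule above is mechanical and can be sketched in a few lines of Python. The thresholds are as stated in the text; treating the average as an absolute value and the Maximum Offset as the largest absolute conversion factor are assumptions on my part:

```python
def conversion_policy(factors):
    """Choose how to apply NGVD29->NAVD88 conversion factors (in feet).

    `factors` are the per-point VERTCON conversion factors. The average,
    range, and maximum offset mirror the statistics named in the text;
    the range is reported alongside but does not drive the decision here.
    """
    average = sum(factors) / len(factors)
    value_range = max(factors) - min(factors)
    max_offset = max(abs(f) for f in factors)  # assumption: largest absolute factor
    if abs(average) < 0.1:
        return "passive"          # only a "passive" datum conversion applies
    if max_offset <= 0.25:
        return "single factor"    # one county-wide conversion factor
    return "stream-by-stream"     # per-stream conversion factors needed
```

For example, `conversion_policy([0.2, 0.22, 0.24])` falls in the single-factor case, while a spread of factors exceeding 0.25 feet forces stream-by-stream handling.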

Digitizing and Attributing Flood Features (Arcmap Project and File Specifications)
The UTM NAD83 projection, zone 17 is used for all West Virginia countywide flood mapping projects, with the exception of Jefferson County, which is zone 18. All features are initially collected as lines, although special flood hazard areas (e.g., Zone A, AE) are later converted to polygons. All features are drawn in one line shapefile and are later separated into the separate files required to meet MAS deliverables. For the purposes of drawing the flood feature lines we are using a line shapefile with the following attribute fields: Type (text, 10), Letter (text 2), Elev (long integer, precision 5). A description of the values we use in those fields is given below with each different feature type. In the first round of digitizing the shapefile was named All_Lines.shp, although in the future we may switch to using a county name in combination with employee name. Save edits frequently while digitizing, both by using the save edits button in Arcmap and by making backup copies of the file with Arccatalog.
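For reference, the working line shapefile’s three attribute fields can be written out as a simple mapping (illustrative Python only, not an ESRI data structure):

```python
# Attribute schema for the working line shapefile (All_Lines.shp):
# field name -> (field type, width/precision), as described above.
ALL_LINES_SCHEMA = {
    "Type": ("text", 10),    # feature category, e.g. floodway, zone, bfe, channel, stream
    "Letter": ("text", 2),   # zone or cross-section letter, where applicable
    "Elev": ("long", 5),     # BFE elevation value, where applicable
}

# Every feature-type value used in this document fits the 10-character field.
assert all(len(t) <= 10 for t in ("floodway", "zone", "bfe", "channel", "stream"))
```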

Snapping
Begin an edit session and set up the snapping environment. Having snapping turned on is important to allow snapping of BFEs to the edges of flood hazard areas and for snapping the flood zone line segments together. We generally use a snapping tolerance between 7 and 10 pixels; this is a personal drawing preference and may vary from person to person. Use the appropriate snapping mode for each type of feature, i.e. ‘vertex’ for closing zone boundaries, ‘end’ for snapping arc ends together and ‘edge’ for snapping BFE lines to zone boundaries. Note that having ‘vertex’ snapping on can make it more difficult to accurately place BFE endpoints. The goal is clean intersections and BFEs that are snapped to flood hazard area boundaries.

Feature Collection
We generally draw flood map features in this order: floodway, flood zone, BFE, and cross-sections. Some counties have floodway features on a separate map (FBFM) from the FIRM. When working with two maps, collect floodways and cross sections from the FBFM and collect flood hazard zones, BFEs, and streams and channels from the FIRM maps. When working with a FIRM and a FBFM for a panel, it is recommended that lines are drawn from the FBFM first and the FIRM second. Features are to be seamless across panel boundaries, meaning when the same feature type occurs on both sides of a panel boundary, it should be drawn with no interruption. Adjacent panels digitized by different people should have the endpoints of flood feature lines snapped together in the final line shapefile. Be sure to check panel edges carefully for small flood zone polygons.

Panel Index and Base Index
Collection and attribution of flood features will be discussed in detail below. In addition to the flood features, we also submit 2 polygon index shapefiles to FEMA for each county. One of the shapefiles is called S_FIRM_Pan and is an index of the FIRM panels for a county. It is created by digitizing the lines on the scanned and warped county FIRM index. Only unincorporated areas are included in the panel index, not the incorporated areas. Secondly, an index of the “base” data for a county is to be provided in a polygon shapefile called S_Base_Index. In our case, the base data is the DOQQs. The S_Base_Index shapefile can be generated by clipping out the appropriate quarter quads from the DOQQ index. As with all other shapefiles we submit, both the S_FIRM_Pan and S_Base_Index shapefiles have a required attribute table format, discussed later in this document.

Flood Feature Symbology and Attributes

Floodways
The floodway is the channel of a river plus any adjacent floodplain areas. Floodways won’t be found on all panels. There are 2 different presentations of floodways on FEMA panels, which vary by county. In some counties, Berkeley for example, floodway symbology is included on the FIRM (Figure 1a). Other counties have separate floodway panels (FBFM, Figure 1b) and they must be added as a separate layer for floodway line collection.

In the initial drawing, lines defining the floodway are given the following attributes:

Type: floodway
Letter:
Elev:

Flood Hazard Areas
Flood hazard areas will also be referred to as ‘flood zones’ or ‘zones’ and they identify areas of different levels of flood risk. Flood zones are labeled on the FIRMs with letters; commonly used zone names are A, AE, B, C, X and they are shown on the paper maps with different densities of shading and text labels (Figure 2a). Zones are collected as lines, although later they will be converted to polygons. Digitizing proceeds from the inside out, i.e., collect the innermost zones first (In Figure 2a, the floodway would be collected first, and then AE, then X). Where an outer zone line flows into an interior zone line, they should be snapped (Figure 2c). Each line defining flood zones should be collected only ONCE. In areas where zone boundaries are coincident, only one line is collected (Figure 2c). There are zone division lines (Figure 2c and d, also referred to as gutter lines), which separate “special” flood hazard areas (generally zones A and AE). The zone division lines are thin white strips that are hard to see in the shaded zones. Gutter lines should be considered the border of those particular zones and treated as any zone boundary would be (i.e., collected once, continuous with other zone lines).

In the initial drawing, lines defining the flood hazard areas are given the following attributes:

Type: zone
Letter:
Elev:

Base Flood Elevations
Base Flood Elevation (BFE) is the height of the base (100-year) flood in relation to a specified datum. BFEs are symbolized on the FIRM panels with a wavy line (Figure 3a) but the feature is usually collected as a straight line (Figure 3b) that is snapped to the edge of the flood hazard area. If there is a significant bend in the BFE as drawn on the panel, then additional points may be added to follow the curve. Ends should always be snapped to the flood hazard area.

In the initial drawing, lines defining the BFEs are given the following attributes:

Type: bfe
Letter:
Elev: numeric elevation value on FIRM (e.g., 405)

Cross Sections
Cross sections (Figure 4a) show the location of floodplain cross sections used for computing base flood elevations. Cross sections are normally collected as a straight line, crossing and exiting the flood hazard area (Figure 4b). It is not necessary to follow bends in the cross section line that occur outside of the flood hazard area, nor is it necessary to extend the line through the hexagons at the end of the line symbol. If there are bends in the cross section within the flood hazard area, place only as many vertices needed to maintain shape. Cross section lines should not be snapped to the flood hazard area lines, and instead should extend beyond them.

In the initial drawing, lines defining the cross sections are given the following attributes:

Type: xsect
Letter: cross-section letter from the FIRM (e.g., A)
Elev:

Channels and Streams
Channels and streams (Figure 5a and 5b) are collected in the flood hazard areas for QC purposes. No snapping is required and the stream or channel line should extend just beyond the flood hazard area when applicable. Streams are collected as single lines and both lines of a channel are collected.

In the initial drawing, lines defining the channels and streams are given the following attributes:

Type: channel or stream, as appropriate
Letter:
Elev:

POST-DRAWING QUALITY CONTROL AND ADJUSTMENTS

Visual QC Of Linework
After all lines are digitized and in a countywide, seamless file, a visual check is done to ensure that all features have been collected. The “Type” field in the line shapefile can be used to categorically symbolize the different feature types for the visual QC. Different colors and line styles can be used to represent separate feature types and the legend symbols can be saved as a layer file to preserve the symbol assignments. Turn on the labels for BFEs (elevation) and xsections (letter) and select a font style and color that allows them to be easily seen and checked in the visual QC process. Each person will probably have a different method of doing a systematic visual inspection. Some suggestions: a grid could be used to scan the linework, drainages can be followed, or the check can be done panel by panel. The important thing is to scan at a level such that all of the panel raster features can be identified and vectors examined. The person doing the QC should have a full understanding of what features are supposed to be collected and the symbology variations (e.g., floodways on FIRMs vs FBFMs). Any missed features should be digitized. This is also a good time to make note of any unusual problems or non-conformities in the scanned panels (e.g., zone type changes at panel or corporate boundary). This is the time to check that features are seamless across panel boundaries; BFEs and cross sections in particular should be checked at panel boundaries because there is no further geometric processing with these lines that will reveal continuity errors.

Spatial Adjustments (otherwise known as “Adjusting To The Real World”)
Post-drawing manipulation of lines to improve “fit” is hard-to-quantify and subjective. As stated in the introduction, FEMA requires the digital data to have a reasonably good fit to the “real world”. The “real world” in our case is the DOQQs. The scanned panels do not warp perfectly and in some areas the digitized lines will not overlay real world features very well. Current adjustment procedures involve these steps:

1. Compile the following layers in Arcmap:
a. DOQQs
b. Line shapefile with county-wide seamless flood features
c. 1:24,000-scale NHD centerline data layer (route.rch, in catalog unit coverages)
d. Problem point file (discussed in the next section)
2. Determine a systematic method for visually scanning the data (similar to that used in the visual QC) and adjust “Type” symbology for easy differentiation.
3. Begin a visual check of the linework, this time concentrating on how well the streams and channels drawn from the flood panels line up with the DOQQ and the NHD data. It is strongly recommended that you do not use the FIRM panels at this point, as they will increase confusion.
4. NHD data are a fairly good guide to where the flood panel waterways “should” be; however, they are not perfect. While visually scanning the linework, check that the streams and channels collected from FEMA panels line up fairly well with the NHD data, while also checking to see that the NHD data appears to overlay the hydrologic feature on the DOQQ. There is never going to be a perfect fit; the panel streams will wander back and forth over the NHD vectors. What you are looking for is areas of consistent difference that extend for a noticeable distance (again, hard to quantify). In Figure 6a, the blue dashed panel stream channel lines are not aligned with the DOQQ stream channel edges.
5. When areas of consistent difference are found, ALL the linework surrounding the area is shifted at the same time, until the panel stream has a better fit to the real world stream. This is accomplished by first breaking all the continuous flood zone, floodway, and stream lines at about the same point on 2 imaginary lines that run perpendicular to the “flow,” one at each end of the area to be shifted. Then, the cut lines are selected, along with any BFEs or cross sections that are in the area (Figure 6b), and all the selected features are moved until the streams are better aligned (Figure 6c). The adjustment is accomplished mostly with the move tool in Arcmap, although on occasion the rotate tool may be used to improve the fit of the selected lines with the DOQQ.
6. Lastly, snap the dangling ends together and smooth out the curves of the reattached lines by moving or adding vertices (Figure 6d). This is the only time lines should be moved or stretched individually, as it distorts proportions.

Mapping Problem File
One of the required deliverables is a point file indicating areas where certain “problem” situations arise. At the same time as adjustments are being performed, the problem point file can be edited. FEMA defined mapping problems are outlined in the draft Technical Memo, dated October 3, 2003, a copy of which is found in the FEMA project notebook; they have also been listed below for convenience. A point shapefile is created for each county with the following fields: Error_type (text, 10) and Descrip (text, 75).

Error_type Descrip
BFE Base Flood Elevation problem
XSECT Cross-section problem
SFHA-PAN Special Flood Hazard Area changes at map panel edge
SFHA-BDY Special Flood Hazard Area changes at a political boundary
SFHA-STR Special Flood Hazard Area different on each side of a stream
SFHA-OTH Other Special Flood Hazard Area problems
STR-FW Stream outside of floodway
STR-SFHA Stream outside of Special Flood Hazard Area

As of this writing, we have primarily found the STR-SFHA, STR-FW, and SFHA-BDY types of errors. Note: errors should be determined AFTER lines are adjusted in a given area, as the adjustment may correct the problem. Place a point in the shapefile at the location where the problem occurs. In Figure 7 the pink point indicates a location where the stream (orange) is outside of the flood hazard area (blue line).
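When scripting creation of the problem point file, the FEMA error codes above make a natural lookup table; the code strings fit the 10-character Error_type field and the descriptions fit the 75-character Descrip field:

```python
# FEMA-defined mapping problem codes (draft Technical Memo, Oct. 3, 2003),
# keyed by the Error_type value written to the problem point shapefile.
MAPPING_PROBLEM_CODES = {
    "BFE": "Base Flood Elevation problem",
    "XSECT": "Cross-section problem",
    "SFHA-PAN": "Special Flood Hazard Area changes at map panel edge",
    "SFHA-BDY": "Special Flood Hazard Area changes at a political boundary",
    "SFHA-STR": "Special Flood Hazard Area different on each side of a stream",
    "SFHA-OTH": "Other Special Flood Hazard Area problems",
    "STR-FW": "Stream outside of floodway",
    "STR-SFHA": "Stream outside of Special Flood Hazard Area",
}

# Field widths from the shapefile spec: Error_type text 10, Descrip text 75.
assert all(len(code) <= 10 for code in MAPPING_PROBLEM_CODES)
assert all(len(desc) <= 75 for desc in MAPPING_PROBLEM_CODES.values())
```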

POLYGON CREATION

The flood hazard zones and floodways must be converted to polygons for final processing. Select all lines with a “Type” of zone or floodway and export to a separate line shapefile. Topological checks will be performed on the line file before polygons are built. Topology work can only be done in Arcmap via the geodatabase model. Import the line shapefile into a geodatabase feature class that is under a feature dataset (must have a feature dataset to create a topology). If you are starting with a geodatabase / feature class, then use Export | Geodatabase to Geodatabase in Arccatalog to transfer the feature class into the dataset.

Add a new topology under the feature dataset. Set the cluster tolerance relatively high (0.1 was used in the first 2 MAS, which corresponds to 10 centimeters on the ground) to reduce the number of small pieces formed. Only the flood hazard zone lines feature class will participate in the topology. The topology rules used are: must not have pseudos, must not have dangles, and must not self-overlap. After creating the topology for the lines, validate it. Bring the validated topology layer into an Arcmap project to view the errors found. Use the topology tools to analyze and correct all errors before proceeding. See the Topology section in the ArcGIS book “Building a Geodatabase” for help.

After validating the topology and fixing all topological errors, convert the lines feature class to a polygon feature class. To do this, right click on the feature dataset in Arccatalog and select ‘new’ and then ‘polygon feature class from lines’. A wizard helps with the conversion; accept the default tolerance.

Once the polygon layer is created, create a new topology for it. Use the default cluster tolerance, which is very small. Only the polygon feature class participates in the topology, and the rules are: must not overlap and must not have gaps. Bring the validated polygon topology into Arcmap as with the line topology. Ideally, there will be no errors in this topology. After checking for and fixing topological errors, another check should be done for sliver polygons. This can be done by viewing the polygon attribute table in Arcmap and sorting the table based on the shape_area attribute field in ascending order. Examine the smallest polygons to be sure they are not slivers.
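The sliver check (sort by shape_area ascending, examine the smallest polygons) is easy to script outside Arcmap too. A sketch, where the area cutoff is a hypothetical value of my choosing; anything it flags still needs manual inspection:

```python
def flag_possible_slivers(polygons, area_threshold=1.0):
    """Return ids of suspiciously small polygons, smallest first.

    `polygons` is a list of (feature_id, shape_area) pairs; the threshold
    (square meters here) is a hypothetical cutoff, not a FEMA-specified one.
    """
    by_area = sorted(polygons, key=lambda p: p[1])  # ascending, as in the QC step
    return [pid for pid, area in by_area if area < area_threshold]

# Hypothetical features: two plausible flood zones and one sliver.
zones = [(1, 48211.6), (2, 0.04), (3, 9120.3)]
assert flag_possible_slivers(zones) == [2]
```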

Next, the polygon flood hazard features need to be attributed. This can be done in the geodatabase by setting up a domain so that attributes can be chosen from a drop-down list. Overlay the flood hazard polygon layer with the FIRM/FBFM panels and attribute the polygons. It saves time if the shapefile you are using to add attributes has the same column structure as the required final product (see Table 2). In the future we hope to have template files available for use, so that the required structure will already be in place. We have tried merging with a template file in the geodatabase, but that resulted in features shifting. This process is still being developed.

PREPARATION OF DELIVERABLES

For the final deliverables, the flood features collected in the line shapefile must be processed into separate shapefiles with specified fields. Table 1 gives an overview of the shapefile names and contents. Attribute fields have required field types (e.g., text, number) and sizes; details can be found on the pages of Guidelines & Specifications for Flood Hazard Mapping Partners Appendix L referred to in Table 1. These pages from Appendix L have been printed out and are in the guidelines/technical section of the FEMA project binder. Table 2 provides details on the required fields.

Table 2. Shapefile attribute field requirements
Shapefile Field Name What Goes In It
S_Fld_Haz_Ar (polygon) FLD_AR_ID A unique feature number. Can be copied from FID field. [Text, 11]
FLD_ZONE Flood zone from FIRM. Use values in FLD_ZONE field of Table D_Zone on pg L-452 of Appendix L. [Text, 55]
FLOODWAY “FLOODWAY” if polygon is a floodway. Null if not. [Text, 30]
SFHA_TF “T” if the zone begins with “A”; “F” for any other zone (true or false). [Text, 1]
SOURCE_CIT 11-digit FIRM panel number that the majority of the feature is on. If the polygon crosses many panels, use the downstream panel. [Text, 11]

S_BFE (line) BFE_LN_ID A unique feature number. Can be copied from FID field. [Text, 11]
ELEV Numeric elevation of BFE, from FIRM [Double, Prec. 13, Scale 2]
LEN_UNIT “FEET” in all cases. [Text, 20]
V_DATUM Vertical datum of panel. Listed on panel, and values must come from the V_DATUM field of the D_V_Datum table on page L-444 of Appendix L. [Text, 6]
SOURCE_CIT 11-digit FIRM panel number BFE is on. If on two, list panel with majority. [Text, 11]

S_Base_Index (polygon) BASE_ID A unique feature number. Can be copied from FID field. [Text, 11]
FILENAME Name of DOQQ or other image file used as base map. [Text, 50]
BASE_DATE Date image was captured. For DOQQs can be found in header file. [Date]
SOURCE_CIT BASE1 or other abbreviation that corresponds to metadata [Text, 11]
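Some attributes in Table 2 can be derived rather than typed by hand. For instance, SFHA_TF follows mechanically from FLD_ZONE under the rule above; a tiny sketch (the function name is mine, not from Appendix L):

```python
def sfha_flag(fld_zone):
    """Derive SFHA_TF from a FIRM flood zone code, per Table 2:
    "T" if the zone begins with "A", "F" for any other zone."""
    return "T" if fld_zone.strip().upper().startswith("A") else "F"

# Examples: zones AE and A are SFHAs, zone X is not.
flags = [sfha_flag(z) for z in ("AE", "A", "X")]
```

Scripting such rules (e.g., with the field calculator) removes one source of hand-entry typos.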

From the line shapefile that was used for digitizing, use the Type field to select and export BFEs to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template. The output merge file can be a geodatabase feature class, which allows for the use of an attribute-domain drop-down for the SOURCE_CIT field. Use ArcMap editing tools to assign attributes to the fields shown in the preceding table. BFE lines are submitted in the S_BFE shapefile.

Cross-section Shapefile Creation
From the line shapefile that was used for digitizing, use the Type field to select and export cross-sections to a separate shapefile. Either modify the resulting shapefile to match the required format, or merge the digitized lines with a pre-formatted template. The output, as with BFE, can be a geodatabase feature class. Attribute domains can be created for the XS_LTR, XS_LN_TYP, WTR_NM (a list of stream names is available in the county FIS book) and SOURCE_CIT fields. Cross-section lines are submitted in the S_Xs shapefile.

Certification
One of the required deliverables relating to the base map (DOQQs in our case) is a “written certification that the digital data meet the minimum standards and specifications.” A text file with the following statement was created:

“This text file serves as written certification that the base map digital data meet the minimum standards and specifications in Guidelines and Specifications for Flood Hazard Mapping Partners Appendix K. On page K-42 (Section K.4.1.1) of that document it is written: ‘The most common form of raster image map is the digital orthophoto, especially the standard Digital Orthophoto Quadrangle (DOQ) produced by the U.S. Geological Survey.’ DOQQs were used as the base map for georeferencing scanned paper FIRMs and for visually locating features of interest.”

2) Base Layer Compilation/Verification
a) Used a vector quarter quad index certified by WVGISTC to confirm that the USGS Digital Ortho Quarter Quads (DOQQs) were in the UTM NAD83 projection; DOQQS were used for the georegistration base map
b) Checked the spatial integrity of a county-wide ortho mosaic (used as a reference; obtained from the NRCS Geospatial Data Gateway)

3) Georegistration of Scanned Panel Source Material
a) Ensured data were correctly referenced to the UTM coordinate system
i) Set ArcMap software data frame projection to UTM NAD83, Zone 17 or 18, as appropriate
ii) Georeferenced scanned panels to real-world coordinates using DOQQs to establish reference links
(1) The mean RMS value for warped panels was 5.63 meters (mapping units). This was the best attainable georeferencing that could be accomplished without stretching features and impacting length relationships
iii) Re-warped portions of scanned panels in areas of poor fit to attain a better visual real-world correlation
b) Checked that the scale of warped raster (.tif) and original paper maps were compatible
i) Plotted georeferenced FIRMS at the same scale as paper maps; conducted manual ruler measurements on paper map in comparison to plotted data to confirm accuracy of feature location and length relationships

4) Digitizing of Flood Features
a) Digitized SFHA, BFE, and cross section features from the georeferenced panels as line feature types
i) SFHAs and floodways were digitized first; BFEs and cross sections were digitized next, and BFEs were snapped to AE zone boundaries (ArcMap snapping tolerance set to 10 pixels)
ii) Streams and channel banks were partially digitized as additional reference features
b) Systematically scanned the collected vectors visually and compared them with the underlying georeferenced flood maps
i) Checked that character of features was maintained
ii) Checked that required features were collected
c) Edgematched features on adjacent panels
i) Checked that features were snapped seamlessly at panel boundaries

5) Spatial Adjustments
a) National Hydrography Dataset (NHD) vector stream centerlines were used to assist in identifying real-world (DOQQ) stream position
b) Proportional piecewise adjustments
i) Adjusted all features (SFHAs, BFEs, cross sections) in small sections of the floodplain when:
(1) the DOQQ stream was not located within the SFHA or
(2) there was a visibly constant difference between location of the DOQQ stream and location of the digitized stream
ii) Attempted to bring the digitized FIRM stream in line with the NHD stream or the stream on the ground, if it was visible on the DOQQ
iii) Used ArcMap editing functions such as line moving and rotating
c) Created a point shapefile to mark location of “mapping problems” as defined in the FEMA technical memo dated October 3, 2003. Examples of problems found:
i) Stream outside of SFHA
ii) Stream outside of floodway
iii) SFHA changes at political boundary

6) Topology
a) Used the ArcGIS geodatabase model and topology rules on SFHA and floodway line features
i) Corrected pseudo-nodes, dangles, and self-overlapping lines
b) Generated polygons from SFHA and floodway line features and used the ArcGIS geodatabase model and topology rules for polygons
i) Confirmed there were no polygon overlaps or gaps
ii) Removed sliver polygons
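For reference, the mean RMS figure quoted in step 3 is the root-mean-square of the residual distances at the georeferencing control links, in map units. A quick sketch of that computation, with made-up residuals:

```python
import math

def rms_error(residuals):
    """Root-mean-square of control-link residual distances (map units)."""
    return math.sqrt(sum(r * r for r in residuals) / len(residuals))

# Hypothetical residuals, in meters, for one warped panel.
panel_rms = rms_error([3.2, 6.1, 4.8, 7.4])
```

Averaging the per-panel RMS values across all warped panels gives the mean figure reported in the QA/QC summary.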

File Backup
Everything pertaining to the current flood mapping project should be backed up to Vesta. This includes warped panels, line shapefiles, and other reference documents.

A FEMA backup folder is set up at this location:

\\Vesta\FEMA_BkUp

It is visible from the TechCenter network under Vesta and is shared openly. This is where all the files for a MAS in progress should be stored. Use sensible file and folder names to help everyone identify the pieces of the project.

A final backup of everything was kept in this location:

\\Ra\TechCenter\Projects\FEMA

It is recommended that drawing shapefiles be backed up every time they are changed; a file versioning system may be preferable to overwriting the same file each time.
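One low-tech way to get such versioning is to stamp each backup copy with the save time instead of overwriting. A sketch of the naming scheme (remember a shapefile is really several files: .shp, .shx, .dbf and so on, so each would need the same stamp):

```python
import datetime
import os

def versioned_name(path, when=None):
    """Build a timestamped copy name so saves never overwrite each other,
    e.g. zones.shp -> zones_20040312_1415.shp."""
    when = when or datetime.datetime.now()
    root, ext = os.path.splitext(path)
    return f"{root}_{when:%Y%m%d_%H%M}{ext}"
```

Sorting a folder of such names lists the versions in chronological order, which makes rolling back a bad edit straightforward.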

Naming Conventions/Path Structure
FEMA has requested that we name the metadata files in this format:

metadata_countyname.txt

So, for example, the metadata files submitted for Jefferson County were named:

The county name behind the first backslash will change for each countywide project completed and submitted. The Arcshape folder contains the S_Base_Index, S_FIRM_Pan, S_Xs, S_BFE, and S_FLD_Haz_Ar shapefiles, plus the problem shapefile. The Ortho_photos subdirectory contains the DOQQs or other imagery used for the base map. The document subfolder contains the metadata, QA/QC report, and base map certification. I made subfolders for each of those items under the document folder. The RFIRM folder contains all the georeferenced panels.

Now that I update the DFIRM wiki more frequently, I added a lock this past weekend to prevent simultaneous editing. And after being hit by automated comment spam, basic verification was also added while still allowing relatively hassle-free editing.
At some point, I may submit these improvements back to TipiWiki.

The definition of GIS has evolved from ‘Geographic Information System’ to ‘Geospatial Information System’. It is time now that it takes the next logical step to ‘Spatial Information System’. My earlier post wrestled, well, not quite, for a truer understanding of “GIS” given the advent of non-traditional spatial software. Since then I have been convinced that spatial information is better understood by snapping the links that tie, and thus confine, it to geography.

It is therefore disappointing that some professionals continue to look at spatial information from behind the narrow screens of geography. Hopefully, with the entry of non-traditional market forces, this viewpoint will be shaken to the point of abandonment. A truer appreciation of spatial information will require a visual mindset where all spatial components to information are addressed.

Related:
• Front, Side and Top View: Construct two valid isometric projections

“As we become aware of the ethical implications of design, not only with respect to buildings, but in every aspect of human endeavour, they reflect changes in the historical concept of who or what has rights. When you study the history of rights, you begin with the Magna Carta which was about the rights of white, English, noble males. With the Declaration of Independence, rights were expanded to all landowning white males. Nearly a century later, we moved to the emancipation of slaves and during the beginnings of this century, to suffrage, giving the right to women to vote. Then the pace picks up with the Civil Rights Act in 1964, and then in 1973, the Endangered Species Act. For the first time, the right of other species and organisms to exist was recognised. We have essentially “declared” that Homo Sapiens are part of the web of life. Thus, if Thomas Jefferson were with us today, he would be calling for a Declaration of Interdependence which recognises this. This Declaration of Interdependence comes hard on the heels of realising that the world has become vastly complex, both in its workings and in our ability to perceive and comprehend those complexities. In this complicated world, prior modes of domination have essentially lost their ability to maintain control. The sovereign, whether in the form of a king or nation, no longer seems to reign”.

The primary objective of this blog is to mull over industry trends and abstract ideas relevant to the profession, not to regurgitate “operational details”. However, this post may bend that rule.

For those not in the know, a webpage does a lot of behind-the-scenes work before it spits out text on the screen. Here’s a summary of what this webpage does:

The very first thing it does is send out a header depending on the client browser. This is recommended when, say, different protocols are used to access the webpage. Note that this step gets initiated only after the Apache web server has finished running through its configuration directives. The webpage then marks the start time for script download and execution; measuring script download and execution time helps in diagnostics. The webpage also goes down a list of red flags, checking for browser compatibility and permission settings. Later, it establishes connections with MySQL databases and fetches or defines client and script variables.

Only then does the layout begin to emerge, with some CSS, XHTML and plenty of include files. Care is taken to separate presentation, which has been kept to a minimum given the volunteer nature of the website, from content and function, and to make it easier to reuse data. To display news feeds, as is the case here, the webpage fetches the feed URL and slices its content into nodes. Sometimes feed URLs do not provide information as desired. For example, this feed URL does not provide a direct hyperlink to its article. Sometimes a feed URL includes an image path in its description that needs to be dropped. For such cases, scripting languages like PHP offer a wide array of string-manipulation functions. It is advisable to ensure that the webpage continues to get parsed in a timely manner even if the fetching fails.

The webpage then wraps up logging of relevant variables and closes open database connections. If script execution has generated any errors, a summary gets emailed to the administrator. The webpage then spits out the footer. Its decay into dead text is finally complete […well, unless you use AJAX to monitor client behavior, as is the case here].

A quick note on the website maintenance: Given its volunteer nature, it is maintained in small nudges, i.e. “minor increments made frequently”, with the emphasis being on function over form.
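The feed cleanup described above reads much the same in any scripting language. A Python sketch of the two fixes mentioned, dropping an embedded image path from a description and falling back gracefully when an item carries no direct hyperlink; the sample item and fallback URL are invented:

```python
import re

def clean_description(desc):
    """Strip <img ...> tags that a feed embeds in its description text."""
    return re.sub(r"<img[^>]*>", "", desc).strip()

def item_link(item, fallback):
    """Use the item's own hyperlink when present, else a fallback URL."""
    return item.get("link") or fallback

# A feed item whose description carries an unwanted thumbnail and
# which provides no direct link to its article.
item = {"description": '<img src="/thumb.jpg">Flood maps updated for two counties.'}
text = clean_description(item["description"])
url = item_link(item, fallback="http://example.org/news")
```

Wrapping the fetch itself in a timeout and an error check keeps a dead feed from stalling the rest of the page, as the post advises.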

Related:
• World Wide Web Consortium
• Web Style Guide
• Interesting Websites
º http://www.nyas.org/
º http://news.google.com/
º http://www.cancer.gov/
º http://www.nobodyhere.com/
• Website Theme: A lot of experiences came together to start and shape the evolving theme of this website. During the 2002 Colorado/Arizona wildfire disaster, I received an email from the FGDC listserv requesting volunteers for assistance. Then at the 2003 ESRI Annual Conference, I learnt how volunteering is not easy, and how the volunteer is not always in control. The omnipresence of mature open-source software not getting enough attention from the general public was a cause for concern. Also, a need was felt to enhance the functionality of my cellphone by connecting it with custom online applications. Additionally, there was a personal need to digest vast amounts of professional information from anywhere.

It is good to know that some professionals concur with the views expressed in my earlier post on the potential for graphic software, like Macromedia Flash. One comment links to an impressive demonstration of this largely untapped potential.

Anyway, two companies whose product GUIs I enjoy interfacing with, Adobe and Macromedia, announced their merger earlier this month.

Both their flagship products have become industry-standards in exchanging documents and creating experience-rich applications across platforms. The largely unused spatial potential within Macromedia Flash combined with the increasingly widespread use of Adobe PDF/SVG maps and the sprouting of some exciting derivatives like geoPDF, pstoedit and GSview, make this merger important to how spatial information is exchanged in the near future.

A quick note on the happenings at Google: Yesterday, Google added satellite imagery to its mapping. For speedy displays, 256px*256px JPEG image-tiles scanned at different zoom levels, each weighing around 30 KB, coupled with some nifty AJAX, come in handy.

Such a drag-and-drool tiling paradigm, although practised for some time now by website developers to load large images, represents a refreshing out-of-the-box approach when applied to internet mapping. The GET HTTP request method uses a cryptic naming convention to fetch these image-tiles from a preexisting palette, like so:

Unlike its regular mapping, where Google predictably uses GIF image-tiles each sized at 128px*128px, for its satellite imagery Google’s preference for JPEG over the competing PNG format is worthy of a second glance: As is common knowledge, JPEG supports millions of colors but is infamous for its lossy compression. PNG, on the other hand, is lossless while supporting millions of colors. However, PNG is currently not supported by all browsers and, depending on compression settings, may end up weighing more.
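The tile-fetching arithmetic is easy to see: at zoom level z the world is cut into a 2^z by 2^z grid of fixed-size tiles, and a longitude/latitude pair maps to a tile column and row through the spherical Mercator projection. A sketch of that mapping, following the now-common x/y/zoom tiling convention rather than Google’s exact naming scheme:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Map a WGS84 lon/lat to a spherical-Mercator tile (x, y) at a zoom level."""
    n = 2 ** zoom  # tiles per side: 1, 2, 4, ...
    x = int((lon + 180.0) / 360.0 * n)
    lat_r = math.radians(lat)
    y = int((1.0 - math.log(math.tan(lat_r) + 1.0 / math.cos(lat_r)) / math.pi) / 2.0 * n)
    return x, y

# At zoom 1 the world is a 2x2 grid; a point just south-east of (0, 0)
# falls in the bottom-right tile.
tile = lonlat_to_tile(0.0, 0.0, 1)
```

The client then only requests the handful of tiles intersecting the viewport, which is what makes dragging feel instant.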


[10/18/1907] Signalizing the opening of the Marconi Service to the public, and conveying a message of congratulation from Privy Councillor Baron Avebury, formerly Sir John Lubbock

[01/08/1927] Opening new radiophone service; First private call to The New York Times

[10/05/1957] The Naval Research Laboratory announced early today that it had recorded four crossings of the Soviet earth satellite over the United States

[07/21/1969] Astronauts land on plain; Collect rocks; Plant flag

Since the most important technological developments in the time period covered occurred in the Western world, and since The New York Times can safely be assumed to best mirror these developments, notwithstanding the selective sample included in Page One, I consider these to be our most important technology-related headlines from 1851 to 2002. Although, sometimes technological change can seep in without so much as a loud knock or one bold headline [think Internet].

For those wondering about a headline that may seem conspicuous by its absence, say one that heralds the omnipresent automobile, keep in mind the time period covered. It is widely accepted that the self-propelled vehicle, for example, originated with France’s Nicolas-Joseph Cugnot (1725-1804), whose steam carriage predates 1851.

Nutshell: “‘Substituting tax-increase with state lottery’ [Policy – Director/Manager/Planner] as a means to generate additional revenue. Here, it becomes important to first find the ‘percentage of non-gamblers/gamblers/disinterested in the affected constituency’ [Information – Spatial Analyst] because ‘opposition to such a move is more likely to come from non-gamblers’ [Theory – Planner]”.

Such a policy-decision can then be supported by any of the many preferred values for its successful adoption: Religious Value- ‘Scriptures say lottery is a sin, but taxing is a bigger sin. Hence…’; Nerdy Value- ‘People who are weak in probability must pay for it. Hence…’; and so on.

By similarly lopsiding options and obfuscating issues, policy-makers often nudge the intellectually lethargic mass along a preferred course.

I have also added this post to this Wiki, in case you want to expound and guide those who follow – the post just helps me ensure the data doesn’t get spammed out that easily:

I am getting a ‘jsForm.htm not found’ error? If you are using Internet Explorer, first make sure you have the latest version of that browser. Then remove the ArcIMS site from your browser favorites, reopen the browser and try again.

How do I import ArcIMS maps inside ESRI ArcMap? If you have ArcMap 9.x, you can import ArcIMS maps by connecting to the services of an ArcIMS server. In ArcCatalog 9.x, simply click on ‘GIS Servers’ to add the ArcIMS server and type in its URL. Note that this does lead to a noticeable performance drop.

How do I accurately rescale the map when that functionality is provided? True scale depends on monitor resolution, the default being 96 DPI (dots per inch). To make sure that your monitor is configured correctly on MS Windows, check Display Properties -> Settings -> Advanced -> General. Note that when the map is rescaled to, say, 1:12000, 1 inch on the map should represent 12,000 inches on the ground. Also note that you can use the Esc key on your keyboard to stop the map from rescaling at any time. Refer to Map Scales for related information.
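The scale arithmetic above can be made concrete: at 96 DPI one on-screen inch is 96 pixels, so at 1:12,000 each pixel represents 12,000 / 96 = 125 ground inches. A quick check:

```python
def ground_inches_per_pixel(scale_denominator, dpi=96):
    """Ground distance, in inches, represented by one screen pixel."""
    return scale_denominator / dpi

# At 1:12,000 on a 96 DPI display, one pixel spans 125 ground inches
# (a little over 10 feet), which is why true scale breaks if the
# monitor is not actually running at its assumed DPI.
per_pixel = ground_inches_per_pixel(12000)
```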

I click on the print button but nothing happens? Make sure pop-ups are allowed for your ArcIMS site, then try the Print Tool again.

What is the most effective method to spread the digital wave, especially of the spatial kind, in rural communities and developing countries? The following links offer some fodder, although Korea left the company of developing nations some time ago. A lot of talk has centered around the potential of wireless to bridge the digital chasm between the Knows and the Know-nots in places lacking adequate infrastructure.

Such graphic software, minus the topology and advanced query benefits, functions well as a basic spatial tool and comfortably serves data over the web with a “fair” amount of interactivity.

Does this make your overpriced IMS overhyped and overblown too?

[my comment]
Macromedia Flash fills this niche quite well as demonstrated [here]. And as the market seems to indicate, it does that [while] satisfying more customers than an overly fancy GIS would. [This] reminds me of the MapQuest survey in which polled customers had expressed great contentment with their level of map detail, whereas cartographers were red with indignation. Akin to using an atomic clock to serve your wake-up call: not needed!
[/my comment]

So is the complexity in Geospatial, better still Spatial, Information System or SIS overblown too? Much of SIS requires common-sense logic arranged linearly. If a person can drive her car in rush-hour traffic as she deciphers vague directions off a schematic map while trying to make sense of rain-washed road signs and maintain a semblance of conversation with her passenger, and still manage to engage the kid in the back seat [read multi-linear tasking], she can achieve a sound understanding of spatial databases with a little persistence, save for the eye for detail that comes only with practice.

My point: SIS is non-complex and not at the cutting-edge of technological change, and there is ample room for non-traditional spatial software!

PS:
• This rise of non-traditional spatial software challenges the accepted definition of SIS. If you were to follow the modernist approach to design, wherein at the end you remove everything you can without taking away from the essence of your creation, and apply it to defining an SIS, you wonder: what would such a conceptual SIS be in its simplest, stark-naked, Spartan form?

As the year-end inches closer, let us look at one significant industry trend:
A potential increase in location-based wireless services [“Where are my kids …no really, WHERE are my kids …and give me that in Lat/Long”]? This could be brought about by the spread of handy ‘location-aware’ productivity tools, such as a GPS-enabled, internet-ready BlackBerry phone that also functions as a TV. Such tools could tell you when your family members or selected friends move into your vicinity. Based on industry reports, this might be old news in parts of Japan.

The earliest benefit could be in emergency response, which just might be the area most likely to get heavy government funding. For example: volunteer fire departments being able to access the critical layout and hydrant information they need for machine placement and egress-route planning as they respond to a distress call. Or, first responders being able to retrieve medical history on the go. Check out an earlier National Incident Management System memo. Also, take a look at the developments at the WV Statewide Addressing and Mapping Board, which plans to implement a statewide Spatial Information System [SIS] using aerial photography etc. The project has been funded in part by Verizon. Its objective is to help emergency response by integrating mapping with E911, postal and public utility services, and telephone companies. This project was initially started to provide city-style addresses for rural areas so that all areas receive the same level of emergency services. With this broadening of its scope, it could serve as a guide for other states.

Interesting blog on Life With Alacrity about Social Software. For the uninitiated, crudely put, Social Software, also called Groupware or Collaborative Software, is software that facilitates group interaction. Often, there is “no overt coordination with the group functioning as an aggregation of interested individuals” rather than as a cohesive unit.

Two intriguing perspectives on the internet from the blog:
 “By ‘augmenting human intellect’ we mean increasing the capability of a man to approach a complex problem situation, to gain comprehension to suit his particular needs, and to derive solutions to problems” [Engelbart. Augmenting Human Intellect: A Conceptual Framework. 1962].
 “To appreciate the importance the new computer-aided communication can have, one must consider the dynamics of ‘critical mass,’ as it applies to cooperation in creative endeavor. Take any problem worthy of the name, and you find only a few people who can contribute effectively to its solution. Those people must be brought into close intellectual partnership so that their ideas can come into contact with one another. But bring these people together physically in one place to form a team, and you have trouble, for the most creative people are often not the best team players, and there are not enough top positions in a single organization to keep them all happy. Let them go their separate ways, and each creates his own empire, large or small, and devotes more time to the role of emperor than to the role of problem solver” [Licklider. The Computer as a Communication Device. 1968].

Interesting web-based map viewer – very snazzy. Now if only the download were quicker.

In related news, Google acquires Keyhole: a company promising a similar 3D interface. Right now, if you google an address, Google provides links to 2D maps from Yahoo! Maps and MapQuest. Google also provides possible address matches and map links if you type in a name, akin to what Switchboard does.

It would be better if you could click and drag on a map to limit the spatial extent of your search, although that would clutter the clean interface of Google Local, which, by the way, does show maps.