Data-Driven Fiber Projects

An up-front investment in data can help cities build cost-efficient fiber networks that meet the needs of residents and businesses.

Technology

Fiber infrastructure is now essential for any modern municipality looking to improve social and business outcomes for its constituents. However, deploying an FTTx network involves many stakeholders, each with unique and complex requirements. Unless municipalities understand all these needs, time and cost overruns will inevitably result.

To overcome these challenges, municipalities have a critical need to invest in data to support planning processes and guide decisions at each stage of an FTTx build. A fully data-driven approach makes it possible to overcome the issues that inhibit network deployments and ultimately deliver:

An asset that generates ROI sooner

A less expensive network

A more transparent, efficient end-to-end process

Why Build a Fiber-Powered Gig City?

There are many well-documented trends in technology uptake and bandwidth growth that drive overall internet consumption forecasts. However, it is important to define the primary need in the particular community to be connected. The primary need may be to create greater market competition, improve small-business investment, support critical social services or take a step toward the smart-city movement.

In addition, determining core architecture and technology goals is critical, as they impact the timeline, cost, funding options and level of engagement required from the community.

Chattanooga, Tennessee, was the first U.S. city to provide full gigabit speeds to the local community. Since then, economist Bento Lobo of the University of Tennessee has documented substantial social and economic benefits of the city’s FTTH rollout. The fiber network has so far provided $865 million in benefits for Chattanooga and added more than 2,800 jobs. The ancillary benefits have also supported education, health, small business, the arts and municipal services.

The nearby city of Huntsville, Alabama, which is also building out a fiber network, has seen an increase in competition, which ultimately means more choice and better overall service for consumers. Unlike Chattanooga, which operates its network as a retail provider, Huntsville is retaining ownership of the backbone infrastructure and leasing network bandwidth on an open-access basis to providers such as Google Fiber. Other providers, such as AT&T, are building competing fiber networks in Huntsville.

Though no two fiber projects are the same, key learnings can be applied, and a wealth of knowledge is available from those who have been part of previous projects. The list of municipal fiber networks on the Broadband Communities database at www.fiberville.com provides a great starting point to research other projects. Data on projects’ timelines, premises connected, take rates, budgets and geographies can greatly assist in drawing analogies.
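Drawing analogies from comparable projects often comes down to a few simple ratios. As a minimal sketch, the figures below are invented placeholders, not data from any real project; actual numbers would come from a source such as the database referenced above.

```python
# Hypothetical figures for illustration only -- real numbers come from
# databases of comparable municipal fiber projects.
comparable_projects = [
    # (name, premises_passed, subscribers, total_budget_usd)
    ("Project A", 60_000, 27_000, 45_000_000),
    ("Project B", 12_500, 4_000, 11_000_000),
]

def summarize(name, passed, subs, budget):
    """Compute the two ratios most often used to compare projects."""
    take_rate = subs / passed            # share of passed premises that subscribe
    cost_per_premises = budget / passed  # capex per premises passed
    return name, round(take_rate, 3), round(cost_per_premises)

for project in comparable_projects:
    print(summarize(*project))
```

Comparing cost per premises passed across projects with similar geography and density gives a first-order sanity check on a proposed budget.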

Beyond the Pipe Dream

Even though the industry is moving toward fiber and there is a valid business case, from a business and social perspective, to build fiber networks, each city needs to determine if an FTTx build is feasible in its location. To do this, it must start with a solid, data-driven foundation and understand everyone’s data needs.

Table 1: Estimated broadband expenditures in a two-county rural area

There are a number of key roles, each important at different stages of the process. These roles can be filled by internal staff or external partners, and one person commonly holds multiple roles. Some of the key roles and their data requirements are shown in Figure 1.

The traditional approach is to consider an FTTx project as a collection of individual silos of fragmented and unrelated tasks. In reality, all stakeholders and all information (in the form of data) relate to one single asset – and this is how an end-to-end, data-driven approach guides such a project.

Consider the story of four people in a dark room with an elephant: One holds the trunk and says it is a snake, one holds the tail and says it is a piece of rope, one holds the foot and says it is a tree, and one holds its tusk and says it is a pipe. In the absence of a shared view of the truth, and without a consistently applied data model, each stakeholder on an FTTx project is in the dark with only a small portion of the needed information.

Standardizing and Validating Data

Different individuals and organizations involved in a multiparty project have differing biases and interpretations of requirements. That’s why shared documents that outline key project specifications are highly beneficial. These documents can be the starting points for project phases all the way from procurement through to construction.

Data should be prescribed in such a way that the formats, contents and version control are well-defined. This enables true collaboration and allows all stakeholders to have a shared, data-driven view of the asset at any point as well as key updates as events happen. Typically this is referred to as a schema, and multiple industry products facilitate this process. When a valid data schema has been created and when all partners have access to the data, keeping everyone honest and on the same page becomes a lot easier.
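To make the idea of a schema concrete, here is a minimal sketch of a conformance check, assuming a simple field-name-to-type mapping and an invented premises record; real projects typically use a formal schema language and GIS-aware tooling rather than hand-rolled checks.

```python
# Assumed field names for illustration; a real FTTx schema would carry
# many more attributes plus geometry and version metadata.
PREMISES_SCHEMA = {
    "premises_id": str,
    "address": str,
    "latitude": float,
    "longitude": float,
    "serving_cabinet": str,
}

def validate(record: dict, schema: dict) -> list:
    """Return a list of problems; an empty list means the record conforms."""
    problems = []
    for field, expected in schema.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not isinstance(record[field], expected):
            problems.append(f"bad type for {field}: {type(record[field]).__name__}")
    return problems

record = {"premises_id": "P-001", "address": "1 Main St",
          "latitude": 35.04, "longitude": -85.31}
print(validate(record, PREMISES_SCHEMA))  # serving_cabinet is missing
```

Running every incoming data delivery through a check like this, before it reaches designers or construction partners, is what keeps all stakeholders working from the same shared view of the asset.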

Often, project managers take a wait-and-see approach and leave data review until late in the process. For a project to truly run smoothly, it is important to analyze the cost impact of data quality early. The Google Fiber checklist provides a great outline of the data that can and should be analyzed.

There are multiple methods of constructing a design-ready data set. All such methods have different time and cost impacts. Figure 2 shows possible approaches and their relative strengths and weaknesses. Often, using a combination of these approaches minimizes the negative impacts of each. For example, field inspection could be avoided for some data if it is being used for another process or system and is known to be of good quality.

Choosing Premises to Connect

Which premises to connect, how to connect them and the cost of connection all interact in complex ways. A good starting point for determining which premises to connect is to download or internally source potential customer data. When planning a larger project, it's a good idea to consult public data sources before deciding on a subset of users to connect. OpenAddresses (openaddresses.io) is a great free, open source of address locations, and OpenStreetMap is a free, editable map of the world that includes buildings. Geocoding services, such as Google's, along with property and regional data from Zillow and the U.S. Census, create a rich picture of the premises that could be connected.
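Combining several of these sources inevitably produces duplicate address points. As a rough sketch, the snippet below merges two hypothetical point sets and drops near-duplicates by rounding coordinates; the field layout and sample points are assumptions, and real OpenAddresses or OpenStreetMap extracts would be processed with proper GIS tooling.

```python
# Illustrative merge of address points from multiple sources.
# Rounding to 5 decimal places treats points within roughly a meter
# of each other as the same premises.
def merge_premises(*sources, precision=5):
    seen, merged = set(), []
    for source in sources:
        for lat, lon, address in source:
            key = (round(lat, precision), round(lon, precision))
            if key not in seen:
                seen.add(key)
                merged.append((lat, lon, address))
    return merged

# Invented sample points for illustration only.
open_addresses = [(35.0456, -85.3097, "1 Main St"), (35.0460, -85.3100, "3 Main St")]
parcel_data = [(35.0456, -85.3097, "1 Main Street"), (35.0470, -85.3110, "5 Main St")]
print(len(merge_premises(open_addresses, parcel_data)))  # 3 unique premises
```

Listing sources in order of trust means the most reliable record for each location wins, since later duplicates are discarded.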

In addition, data is getting better. For example, a Facebook project is using artificial intelligence to convert satellite imagery into geolocated population data. This data is most useful in geographic information system (GIS) format, and tools such as Google Earth Pro and QGIS are great free starting points.

Technology Choices

Because technology options continue to evolve rapidly, taking multiple options into account is important. Depending on the project, technology choices such as loose-tube versus ribbon fiber, splitter topology and locations, and connectorization can have enormous impacts on the speed and cost of deployment. Manufacturers are well-placed to advise on the potential of their product ranges and the benefits and impacts they can have, though they also have clear agendas and motivation to prove the value of their products.

Technology solution consulting is very complex, as each customer and project requirement must be taken into account. Consider both the capex and opex impacts of a number of scenarios. Depending on the scale of the project, standardizing the component set so that volume prices can be negotiated can be worthwhile. Some components are more commoditized than others, and some require more detailed engineering than others. Components have varying lead times and availability. Using a standardized component set can minimize the backup inventory for failures.

Following are some examples:

In brownfields with existing conduit, microfiber could minimize the amount of overbuild required, thus reducing the cost of construction.

In congested aerial deployments, a distributed split network could minimize the size and quantity of cables attached to poles.

If access to skilled technicians is limited, preconnectorized components could minimize the splicing required.

If a city has a low budget and is using its own employees, a splice-intensive solution may be appropriate if the technicians can be trained or have experience in deploying fiber.
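The capex-versus-opex trade-off behind choices like these can be sketched with a simple comparison over a planning horizon. All costs below are invented placeholders, not vendor quotes, and a real analysis would discount future opex and model failure rates per component.

```python
# Invented per-premises costs for two illustrative deployment approaches.
scenarios = {
    # name: (capex_per_premises_usd, annual_opex_per_premises_usd)
    "preconnectorized": (900, 25),
    "splice_intensive": (700, 50),
}

def total_cost(capex, annual_opex, years):
    """Undiscounted per-premises cost over the planning horizon."""
    return capex + annual_opex * years

for name, (capex, opex) in scenarios.items():
    print(name, total_cost(capex, opex, years=10))
```

With these placeholder numbers, the cheaper-to-build splice-intensive option costs more over ten years, which is exactly the kind of result that only emerges when both capex and opex are modeled together.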

A data-driven approach to technology choice enables the use of modern tools to support rigorous analysis. Auto-design and GIS tools are disrupting the marketplace, but their output is only as good as the data fed into them. Keeping the data stream healthy allows fast, iterative scenario analysis that can significantly improve FTTx rollout decisions, removing bottlenecks and supporting efficient rollouts.

Creating Designs

Once the data has been validated, it's time to start creating designs. These can be indicative or actual outputs (often referred to as "detailed designs" or "constructable designs"). The objective of the design phase is to develop cost estimates and examine possible combinations of materials and construction methods. Engaging in a test-design process allows deployers to validate the quality of inputs, processes and outputs.

Figure 3: Fiber network objects and some of the required data relationships

The challenge is to create designs that have all the attribution, relationships and geometry required to communicate what is needed. Figure 3 shows some items that need to be created in a design and their required information. This data is used as the basis for creating customer connection information through to detailed construction prints.
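The relationships described above can be sketched as a minimal data model. The object names and fields here are simplified assumptions for illustration; production FTTx design tools track far more attribution, plus geometry and splice-level connectivity.

```python
from dataclasses import dataclass, field

@dataclass
class Splitter:
    splitter_id: str
    ratio: str       # e.g. "1:32"
    cabinet_id: str  # parent enclosure

@dataclass
class Premises:
    premises_id: str
    address: str
    splitter_id: str  # serving splitter -- the key design relationship

@dataclass
class Design:
    splitters: dict = field(default_factory=dict)
    premises: list = field(default_factory=list)

    def premises_on_splitter(self, splitter_id):
        """Traverse the relationship from a splitter down to its premises."""
        return [p for p in self.premises if p.splitter_id == splitter_id]

design = Design()
design.splitters["S1"] = Splitter("S1", "1:32", "CAB-4")
design.premises.append(Premises("P-001", "1 Main St", "S1"))
print(len(design.premises_on_splitter("S1")))  # 1
```

Because every premises record carries its serving splitter, customer connection information and construction prints can both be generated from the same underlying data, which is the point of the single-asset view.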

Sharing the initial design with construction partners assists in determining a rate card, or contractual framework for construction. The data can also be validated by using in-field tools or design redlining.

Managing Assets and Operating a Network

Once the input, design data and as-builts are in a consistent form, a deployer can roll out a network in a much more efficient manner. A consistent, transparent data foundation yields benefits not only during rollout but also throughout the entire asset life cycle. During continued upgrades, overbuilds or potentially even a complete change in direction, analysts, construction partners and other key stakeholders become more capable of in-depth visualization and analysis.

The use of data makes a network a living, breathing entity. Bandwidth usage can be related not just to the assets but also to actual customers, enabling rapid fault identification and rectification through a network management system. The network becomes transparent and can be used for advanced applications, such as electricity grid automation.

One of the most important questions potential subscribers ask is “When can I connect?” Different deployers answer this question in a number of ways. However, transparency is always important, as is a streamlined process that links to and enables easy connection fulfillment. A common way of communicating the connection schedule to potential customers is to use a signup web page that enables potential users to see if or when they can be connected.
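Behind such a signup page sits a simple lookup from a service area to its rollout stage. The sketch below assumes invented area names, stage labels and dates purely for illustration; a real deployer would drive this from the same design and construction data described earlier.

```python
# Invented example schedule; in practice this would be generated from
# live design and construction status data.
schedule = {
    "fiberhood-north": ("construction", "2025-Q3"),
    "fiberhood-south": ("design", "2026-Q1"),
}

def connection_status(area):
    """Answer 'When can I connect?' for a given service area."""
    stage, eta = schedule.get(area, ("not yet planned", None))
    return f"stage: {stage}, estimated ready: {eta or 'TBD'}"

print(connection_status("fiberhood-north"))
```

Keeping this lookup fed from live project data, rather than a manually updated page, is what makes the transparency credible to potential subscribers.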

Conclusion

A solid data foundation across end-to-end network delivery is reaping rewards worldwide. The next time you sit down to consider the potential of a fiber project, be sure to have the best possible process and data in place to give it the greatest chance of success.