Solve the towerco IT integration puzzle with Red Cube

We have sat through dozens of product discussions with customers across diverse markets, and one question comes up every time: system integration.

How does our product, Red Cube, fit within the myriad legacy or future tools in the systems landscape of the customer?

Whether it is a new towerco setting up brand-new digital solutions for its site deployments, a towerco being carved out from an MNO and inheriting legacy tools and data, or a well-established infrastructure provider transitioning from older tools, the common concern is how to build a cohesive picture from the many jigsaw pieces of a complex IT toolset.

Both IT and business teams want to know how Tarantula’s product offering will fit into their existing or future enterprise architecture, with minimal impact on their ways of working, without requiring data to be manually replicated across numerous systems, and with data flowing through the tools seamlessly and accurately.

In this blog post, we share insights into how to solve this jigsaw puzzle and describe how we ensure that integrating Red Cube with a tower owner’s tools and platforms is seamless and pain-free.

Designing the enterprise system landscape

When designing the enterprise architecture for a tower owner’s or MNO’s business, enterprise architects need to consider the business requirements of the various stakeholders that impact the site lifecycle, and then map these requirements to best-in-class software tools. Software integration is done not only to ensure seamless data flow between applications but also to provide a cross-product experience to business users, making it easier for them to access and manage their telecom site data. To realize this value, the designers and owners of the different tools and APIs within the IT ecosystem need to jointly agree on who owns what data and what action is taken where.

This is only feasible at the use-case level. Business organizations need to move away from elementary data mapping and internally define what this “cross-product experience” would look like for them in a detailed user-story format.

When the <DATA UPDATE> happens in <ONE OF THE SYSTEMS>, <USER> wants the <DATA TO FLOW> into <ANOTHER SYSTEM>, to generate <BUSINESS VALUE>.

Even though this is not the only way to do it, it is definitely a great place to start before getting into the complexities of the actual system integration. It also helps you realize value by clarifying what happens to operational, financial, and asset data as it moves back and forth across systems, where the original data will be stored, which features to prioritize for the integration, whether integration templates already exist, and, if not, how much code customization would be needed to move data between the required API endpoints.
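
To make this concrete, here is a minimal sketch (in Python, purely illustrative) of how each user story could be captured as a structured record before the actual integration is designed. The field names and example values are our own, not part of any product schema.

```python
# Illustrative sketch only: capture one user story from the template above
# as a structured record that architects can review alongside the API design.
from dataclasses import dataclass


@dataclass
class IntegrationUserStory:
    data_update: str     # <DATA UPDATE>
    source_system: str   # <ONE OF THE SYSTEMS>
    user: str            # <USER>
    data_to_flow: str    # <DATA TO FLOW>
    target_system: str   # <ANOTHER SYSTEM>
    business_value: str  # <BUSINESS VALUE>


# Example story: a power alarm raised in the RMS should appear in the
# site management platform so the NOC team can act on it quickly.
alarm_story = IntegrationUserStory(
    data_update="power alarm raised",
    source_system="RMS",
    user="NOC engineer",
    data_to_flow="alarm details and site ID",
    target_system="site management platform",
    business_value="reduce mean time to repair",
)
```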

From our experience in integrating Red Cube within complex IT landscapes, we recommend the following best practices in designing your tools ecosystem.

Deploying purpose-built tools and avoiding duplication of functionality – WHAT action is taken WHERE?

Best-in-class software applications are built to solve a few common, use-case-focused problems, and it is important to deploy applications that are purpose-built, with domain knowledge baked into the product. It can be tempting to look for a single solution that does it all and thus save on the overall investment, but that approach often turns out to be more costly and time-intensive.

Once you have clarity on the tools that need to be deployed, the next step is to avoid any duplication of functionality by defining what action is taken where. Enterprise architects need to decide which system does what and set boundaries around each system’s functionality. Each system comes with a set of native capabilities that should be leveraged to derive the maximum benefit.

Mastering of data – WHO owns WHAT?

Once the appropriate tools are identified within the system landscape, it is equally important to have clarity on what data is stored where and where it is mastered.

Certain data objects may exist in multiple applications, but it is critical to identify the application that will master each data object. Duplication of data greatly increases the possibility of errors, so towercos and MNOs need clarity on who owns what data, or in other words, which tool acts as the “data master” for a particular kind of data. Where a data object first originates sets the tone for what happens to it downstream and for how its flow needs to be controlled.
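
One lightweight way to make this ownership explicit is a data-ownership map that every integration consults before writing. The sketch below is illustrative only; the system and object names are examples, not a prescribed model.

```python
# Illustrative sketch: record which system masters each data object, so
# integrations always write back to the owner and treat other copies as
# read-only replicas.
DATA_MASTERS = {
    "sites": "site_management_platform",
    "contracts": "site_management_platform",
    "vendors": "erp",
    "customers": "erp",
    "alarms": "rms",
}


def is_writable(data_object: str, system: str) -> bool:
    """Only the mastering system may create or update a data object."""
    return DATA_MASTERS.get(data_object) == system


assert is_writable("vendors", "erp")
assert not is_writable("vendors", "site_management_platform")
```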

Building the interface layer – HOW does the data flow work?

Once the right tools have been explored and identified, it becomes vital to ensure that these tools talk to each other seamlessly, and this is where robust product integration comes in. Depending on the maturity of the applications, integrations can be set up with basic file-based data transfer or with modern APIs using SOAP or REST technologies. In either case, it is important to analyze and understand which systems need to be connected, either point-to-point or via a middleware layer, and then define the endpoints of these integrations. It is equally important to define the cadence of data exchange: critical data such as site alarms needs to be available in real time across systems, whereas non-critical information such as vendor data can be transmitted less frequently and in larger batches.
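
As a rough illustration of these two cadences, the sketch below pushes a critical alarm immediately over REST and sends non-critical vendor data as a flat-file batch. It uses the third-party requests library, and the endpoint URLs and payload shapes are hypothetical placeholders, not actual Red Cube or ERP APIs.

```python
# Sketch of two integration cadences: real-time push for critical data,
# batched flat-file transfer for non-critical data. Endpoints are placeholders.
import csv
import io

import requests


def push_alarm_realtime(alarm: dict) -> None:
    """Critical data: forward a site alarm as soon as the RMS raises it."""
    resp = requests.post(
        "https://site-mgmt.example.com/api/v1/alarms",  # hypothetical endpoint
        json=alarm,
        timeout=10,
    )
    resp.raise_for_status()


def export_vendors_batch(vendors: list[dict]) -> None:
    """Non-critical data: send vendor records as a nightly CSV batch."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=["vendor_id", "name", "country"])
    writer.writeheader()
    writer.writerows(vendors)
    resp = requests.post(
        "https://erp.example.com/api/v1/vendor-files",  # hypothetical endpoint
        files={"file": ("vendors.csv", buffer.getvalue())},
        timeout=60,
    )
    resp.raise_for_status()
```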

Building custom integrations can get expensive very quickly; therefore, the better approach is to use existing integration templates and tweak them to fill any feature gaps for the use case at hand. Configuration based on the product’s capabilities helps businesses achieve their goals while also being easier to integrate and support.

While developing the integration framework, adequate thought must also go into monitoring the data flow between systems, predicting conditions that could cause failures, detecting integration issues as early as possible, notifying the appropriate stakeholders, and putting a troubleshooting mechanism in place before problems occur.
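
The sketch below shows the principle with a simple retry-and-notify wrapper around an integration step. In practice this would usually be handled by the integration platform’s own monitoring; all names here are illustrative.

```python
# Illustrative retry-and-notify wrapper: retry a failed integration step a few
# times, then alert stakeholders if it keeps failing.
import logging
import time

logger = logging.getLogger("integration_monitor")


def notify_stakeholders(message: str) -> None:
    # Placeholder: hook this up to email, chat, or the NOC ticketing tool.
    logger.error("ALERT: %s", message)


def run_with_retries(step_name, step, max_attempts=3, backoff_seconds=5):
    """Run one integration step, retry on failure, and alert if it keeps failing."""
    for attempt in range(1, max_attempts + 1):
        try:
            step()
            logger.info("%s succeeded on attempt %d", step_name, attempt)
            return
        except Exception:
            logger.exception("%s failed on attempt %d", step_name, attempt)
            if attempt < max_attempts:
                time.sleep(backoff_seconds * attempt)  # simple linear backoff
    notify_stakeholders(f"{step_name} failed after {max_attempts} attempts")
```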

Typical tools deployed by telecom infrastructure owners

Most tower owners and operators rely on three major systems for managing the OSS and BSS aspects of their tower site portfolio.

Enterprise Resource Planning (ERP) tools

ERP applications are the primary source of data for setting up key resources such as vendors, customers, contractors, and asset catalogs. During the initial tower build stages, activities such as material requisition, warehouse management, and supply chain management take place in the ERP system. When the site is operational, the financial and accounting module of the ERP tool becomes relevant for managing payments and accounts receivable.

Remote Monitoring Systems (RMS) or Tower Operating Centers (TOC)

When telecom sites are operational, onsite IoT sensors from the RMS/TOC systems become relevant to capture key site information such as power outages, alarms, energy usage, and, in some cases, site access. These site inputs act as triggers for operations teams who ensure that the site always remains operational.

Site management platforms

Enterprise telecom site management applications, such as Red Cube, sit squarely between the two tool categories above and act as the Site Master through which all data flow takes place. These applications are the single source of truth for site data, initiating the site creation and updating the live record of the site as it becomes operational and generates revenue. They enable tracking of upgrades, tenancies, contracts and leases, and operational tasks throughout the lifecycle of the site. They exchange data such as vendors, customers, and materials with the ERP tools, and alarms and incidents with the RMS tools, during the site lifecycle.

Using a product such as Red Cube as the master repository of all site-related data provides the single source of truth for intelligent data analysis and management.

Supporting tools

With the growth of technology and the evolution of optimal business processes, more tools are becoming available to provide additional support for specific use cases to tower owners. These include:

  • Mobile apps such as Tarantula’s Field Force app to enable field staff to complete their onsite work orders and conduct asset audits
  • Automated data capture via drones and photogrammetry software systems to generate 3D digital twin models
  • Document management systems to hold all documents, photos, videos, and drawings with value-added services for optical character recognition and digital signatures
  • Keyless access systems to enable digital locking of sites

Integrating your systems with Red Cube

Tarantula has partnered with some of the leading solution providers, making it possible for Red Cube to integrate with their APIs and offer our customers a comprehensive IT ecosystem.

If we revisit the three best-practice questions for system integration in the Red Cube context:

1. WHAT action is taken WHERE?

In a typical towerco’s IT ecosystem, Red Cube acts as the “operational system of record” where all transactions and decisions related to sites and assets are made. An ERP system, on the other hand, acts as the “financial system of record”, storing and tracking financial information received from Red Cube. An RMS acts as the on-site system of data collection from IoT sensors.

2. WHO owns WHAT?

With all site operations originating in Red Cube, it is the natural Site Master. Moreover, contracts and payment data can also be mastered in Red Cube and fed to the financial ERP tools for accounting. On the other hand, data such as vendors, bank account information, customer details, and warehouse inventory is typically mastered in the ERP application.

3. HOW does the data flow work?

Red Cube comes with out-of-the-box, standard interfaces or APIs based on the needs of towercos and MNOs, and the data transfer they require between various systems during the course of the telecom site lifecycle. We leverage both REST-based and SOAP-based APIs, while also supporting email-based and flat-file interfaces.
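
To illustrate the general shape of such a REST-based exchange, here is a hypothetical sketch of a contract record being pushed to an ERP endpoint. The endpoint path, field names, and authentication scheme are placeholders, not the actual Red Cube API; the real specification is available in the Red Cube API data sheet mentioned below.

```python
# Hypothetical sketch of a REST-based data push from the site management layer
# to an ERP system. All URLs, fields, and credentials are placeholders.
import requests

contract_payload = {
    "site_id": "SITE-001",
    "tenant": "ExampleMNO",
    "monthly_fee": 1500.00,
    "currency": "USD",
}

response = requests.post(
    "https://erp.example.com/api/v1/contracts",       # hypothetical ERP endpoint
    json=contract_payload,
    headers={"Authorization": "Bearer <API_TOKEN>"},  # placeholder credential
    timeout=30,
)
response.raise_for_status()
print("Contract synced, ERP reference:", response.json().get("reference"))
```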

Conclusion

Ultimately, the foundation for successful digitalization is to adopt and scale an integration framework that you have deliberately defined and architected. Once the integration is in place, towercos are able to automate or streamline their asset and financial data flows, so that the right stakeholders have access to them whenever they need to make intelligent, data-driven decisions.

If your user story sounds anything like this – a cross-product experience where data can flow effortlessly for effective site and asset management – the Tarantula team has the answers. Our Professional Services team can help you design your system landscape with Red Cube as the self-contained site management software at the core.

Talk to a Tarantula expert today to get a detailed data sheet on the Red Cube APIs.
