Recent developments in Artificial Intelligence (AI) and Machine Learning algorithms have powered network automation. Mobile Network Operators (MNOs) now use AI-based modules to automate their networks, drawing on the data they are authorized to use in their rented or owned coverage areas for network distribution.
With 5G networks on the verge of disrupting the network paradigm, the network needs to be super-heterogeneous: able to coordinate and organize different types of base stations, such as macro, micro, femto, and pico cells, and to manage massive multiple-input multiple-output (MIMO), millimeter-wave, and device-to-device communications.
But the problem lies in restricted access to data for many MNOs. Blockchain-based data sharing can change this scenario and enhance AI-powered network systems.
Artificial Intelligence is not new to us, but earlier generations of AI algorithms were limited to certain applications, constrained by the restricted computational power of the systems of the time.
Then, as AI became more widely applicable, network operators started exploring AI-powered systems for network organization and distribution. It began with clustering methods for obtaining an optimal partition of the network, moved on to neural network algorithms for optimal traffic routing, and, with advances in data-driven intelligence, algorithms can now learn by accessing large amounts of data.
Image Source: AI-powered networks
With further developments in Artificial Intelligence and computational capabilities, MNOs now use Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) to build organizational models from large amounts of raw data.
Blockchain-based technologies became attractive to many businesses with the advent of smart contracts. The basic problem with earlier blockchains was verification: many experts felt that the democratization of data in blockchain-based data sharing threatened data security.
Smart contracts put an end to the doubts about verification and data ownership. A smart contract is first compiled into machine-level code and uploaded as a transaction to the blockchain, where a miner picks it up; verification is done through voting on the first block, and the transaction is verified on the second block after another client adds some data.
A third client can then read the verified data from the blocks of the blockchain. Thus, smart contracts kept data democratized while still passing it through a verification system. Businesses often prefer permissioned smart contracts over public smart contracts, which are not as secure.
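To make that lifecycle concrete, here is a minimal Python sketch, assuming a toy chain where miners vote on a submitted contract before its block is appended; Block, submit_contract, and the majority-vote rule are illustrative simplifications, not any real platform's API.

# Toy illustration of the verification flow described above; not a real
# blockchain client, just the compile -> upload -> vote -> append steps in miniature.
from dataclasses import dataclass

@dataclass
class Block:
    payload: bytes          # compiled contract bytecode or data added by a client
    votes: int = 0          # approvals collected from the miners

chain: list[Block] = []

def submit_contract(bytecode: bytes, miner_votes: list[bool]) -> bool:
    """A compiled contract is uploaded as a transaction and miners vote on it."""
    block = Block(payload=bytecode, votes=sum(miner_votes))
    # Assumed rule: a simple majority of miners must approve before the block is added.
    if block.votes > len(miner_votes) // 2:
        chain.append(block)
        return True
    return False

def read_verified(index: int) -> bytes:
    """A later client can only read data from blocks that made it onto the chain."""
    return chain[index].payload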
Image Source: System Design
Membership Service Provider (MSP):
The membership service is responsible for issuing memberships and authorizations and for registering participants in the system.
It holds a root certificate, like a master key, and issues a second certificate, the “Cu” key, to registered members. Every time a new member joins the system, a “Cu” key is provided as a new certificate. Private keys are used for identity registration and verification of each member.
In our case, the members are the different mobile network operators (MNOs). Identifying each MNO requires a specific certificate provided by the MSP layer.
Verifier:
A verifier checks the “Cu” certificate issued through the MSP for any user calling an API. An API, or Application Programming Interface, works as the medium of interaction between the system and the user.
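As a rough illustration of how the MSP and the verifier could fit together, here is a minimal Python sketch in which an HMAC signature stands in for a real certificate chain; issue_cu_certificate and verify are assumed names, not part of any actual framework.

# Simplified MSP + verifier sketch: the root key plays the role of the MSP's
# root certificate, and an HMAC signature stands in for a real X.509 chain.
import hashlib
import hmac
import os

ROOT_KEY = os.urandom(32)   # held only by the membership service provider

def issue_cu_certificate(mno_id: str) -> dict:
    """MSP: register an MNO and return its 'Cu' certificate."""
    signature = hmac.new(ROOT_KEY, mno_id.encode(), hashlib.sha256).hexdigest()
    return {"member": mno_id, "signature": signature}

def verify(certificate: dict) -> bool:
    """Verifier: check the certificate before serving any API call."""
    expected = hmac.new(ROOT_KEY, certificate["member"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, certificate["signature"])

cert = issue_cu_certificate("MNO-A")
assert verify(cert)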
Consensus Nodes:
Consensus nodes are responsible for implementing the AI algorithms; this is where our blockchain-based data-sharing system integrates with AI.
They ensure the consistency of the ledger by running the consensus algorithms. This involves endorsing transactions whose raw data has been compiled into bytecode for the blockchain.
It also involves determining the order in which transactions are uploaded to the blockchain. For transaction endorsement, smart contracts designate two supernodes whose endorsement is required before any transaction can be uploaded to the blockchain.
“Hyperledger Fabric” is used for ordering the transactions in the system. Here, transactions represent patterns of data use and behavior.
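A minimal sketch of that endorsement-and-ordering rule, assuming a transaction is committed only when both designated supernodes approve it; the names and the arrival-order policy are illustrative, not Hyperledger Fabric APIs.

# Illustrative endorsement and ordering step; not actual Hyperledger Fabric code.
from typing import Callable, List

# Stand-ins for the two designated supernodes; in the real system these would be
# independent peers that sign (endorse) the transaction.
SUPERNODES: List[Callable[[bytes], bool]] = [lambda tx: True, lambda tx: True]

def endorse(tx_bytecode: bytes) -> bool:
    """A transaction is endorsed only if both designated supernodes approve it."""
    return all(node(tx_bytecode) for node in SUPERNODES)

def order_and_commit(pending: List[bytes], ledger: List[bytes]) -> None:
    """Endorsed transactions are ordered (here, simply by arrival) and appended."""
    for tx in pending:
        if endorse(tx):
            ledger.append(tx)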
Gatekeeper:
The Gatekeeper is the bridge between the data layer and the rest of the system: a smart contract that controls access to the data layer. It helps maintain the correct flow of data and proper access to the raw data.
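A minimal Gatekeeper sketch, assuming the raw data is stored off-chain (an in-memory dictionary stands in for the cloud storage) and that the contracts later resolve the returned address; the class and method names are assumptions.

# Minimal Gatekeeper: stores raw data off-chain and hands back an address that
# the contracts can later resolve. An in-memory dict stands in for cloud storage.
import uuid

class Gatekeeper:
    def __init__(self) -> None:
        self._cloud = {}                      # address -> raw data

    def store(self, data: bytes) -> str:
        """Persist the raw data and return the address recorded on the DataChain."""
        address = uuid.uuid4().hex
        self._cloud[address] = data
        return address

    def fetch(self, address: str) -> bytes:
        """Resolve a data address for a contract whose permission has been verified."""
        return self._cloud[address]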
Blockchains: MNOs can share their network infrastructure and data access to reduce expenditure and operational complexity.
But there are competition and trust issues among the various MNOs, which can be reduced through a certificate authority. To assert stronger certificate-based authority over the shared data, we can use DataChain and BehaviorChain as permissioned blockchains on Hyperledger Fabric.
“Hyperledger Fabric” is an open-source distributed ledger with a modular architecture that allows components such as consensus nodes and MSPs to be plugged into the system quickly.
DataChain provides complete control over access to the data, and BehaviorChain records every data access. In combination, these two blockchains provide authority over the data, control over access, and auditing of huge amounts of data.
Data permissions are a prime concern in any system that allows access to raw data. Considering the risk factors and other security parameters, the data permissions can be divided into four different layers.
Note: Users can set their own data permission levels to have complete permission control
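One plausible encoding of the four layers is sketched below; only the behavior of L2 and L3 is pinned down by the data-sharing pseudo-code later in the article, so the meanings given for L0 and L1 are assumptions.

# Assumed mapping of the four permission layers. Only L2 and L3 are pinned down
# by the DataRequest pseudo-code further below; L0 and L1 are guesses.
from enum import Enum

class PermissionLevel(Enum):
    L0 = 0   # default: only the data owner may read the data
    L1 = 1   # assumption: owner plus designated verifier/auditing roles
    L2 = 2   # only users explicitly listed in the transaction's permissions set
    L3 = 3   # open to any registered, certificate-holding member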
A data structure is designed specifically to expedite data sharing through rapid user query submission and data access. Let us first see how transactions occur in the DataChain.
Image Source: Data Structure
Transactions in the DataChain and the BehaviorChain each comprise a small set of components, which can be read off the contract pseudo-code later in this article: a DataChain transaction carries the data owner, the data address returned by the Gatekeeper, a timestamp, a permission level, and the set of users granted access, while a BehaviorChain transaction records the DataChain transaction ID, the accessing user, and the access logs.
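Here is a sketch of those transaction layouts, with field names inferred from the arguments the contracts below pass to initDataChainTx and initBehaviorChainTx; the exact on-chain encoding is not specified in this article.

# Transaction layouts inferred from the fields the contracts below pass to
# initDataChainTx(...) and initBehaviorChainTx(...); the field names are assumptions.
from dataclasses import dataclass, field

@dataclass
class DataChainTx:
    owner: str                    # certificate holder who generated the data
    data_address: str             # address returned by the Gatekeeper
    timestamp: float
    permission_level: int = 0     # L0 by default
    permissions: set = field(default_factory=set)   # users allowed at level L2

@dataclass
class BehaviorChainTx:
    data_chain_tx_id: str         # the DataChain transaction being accessed
    user: str                     # who touched (or tried to touch) the data
    logs: str                     # what happened, kept for later auditing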
1. Membership Management:
Membership management is done through mutual identity verification and registration to avoid malicious activity and to secure data access. It is done in the following steps:
A pair of keys with identity information is sent to the network infrastructure.
Once a user is validated, a digital certificate is issued to the new user for identification.
2. Data Collection:
There are two basic categories of data: data that concerns user privacy and data that does not. Data collection then proceeds as follows:
The contract verifies the data provider’s identity through a verifier.
Then the contract identifies the raw data fields pertaining to the user’s privacy, such as the User ID. Once identified, these fields are encrypted using asymmetric-key encryption.
The contract sends the data to the Gatekeeper, which stores it in the cloud and returns a data address. On the basis of this data address, the contract initiates a data transaction, as can be seen here:
def DataGenerate(data, certification):
    if Verify(certification):
        # Encrypt any privacy-sensitive fields before the data leaves the contract
        if data.sensitive_data_field is not None:
            data.sensitive_data_field = Encrypt(data.sensitive_data_field)
        # The Gatekeeper stores the raw data in the cloud and returns its address
        data_address = Store(data)
        # Record the new data item as a transaction on the DataChain
        initDataChainTx(data, timestamp, data_address)
        return SUCCESS
    else:
        return CERTIFICATION_ERROR
This is pseudo-code for the data generation contract.
3. Data Permission Level:
As already discussed, there are different data permission levels that a user can set to govern how others access the data the user owns. The user assigns data permissions using the following contract:
def DataPermissionControl(DataChainTxID, certification, permission_level=L0, permissions=None):
    if Verify(certification):
        user = Resolve(certification)
        if user == DataChainTxID.owner:
            # Only the owner may update the permission level and the permitted users
            initDataChainTx(DataChainTxID, permission_level, permissions)
        else:
            # Anyone else's attempt is merely logged on the BehaviorChain
            initBehaviorChainTx(DataChainTxID, user, logs)
        return SUCCESS
    else:
        return CERTIFICATION_ERROR
4. Data Sharing:
If computations need to be run over the data without exposing the raw data, a neural agency is set up to apply the algorithm, with other verifiers and the government involved as potential participants to prevent malicious data access.
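A toy version of that idea is sketched below: the agency runs an agreed computation next to the data and releases only the aggregate result, never the raw records; the averaging function is just a placeholder for the actual AI algorithm.

# Toy "compute without exposing the raw data" step: only the aggregate result
# leaves the agency; the average is a placeholder for the actual AI algorithm.
from typing import List, Optional

def compute_without_exposing(raw_records: List[float], approved: bool) -> Optional[float]:
    """Run the agreed computation next to the data; only the result is released."""
    if not approved:              # the verifiers / regulator must approve the job first
        return None
    return sum(raw_records) / len(raw_records)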
But if the raw data itself needs to be accessed, then data sharing takes place in the following way:
The data request is placed along with a digital certificate and a digital signature.
The contract verifies the data request through the verifier.
The authenticity of the request is checked through the Gatekeeper and the DataChain. If the access request is found to be legitimate, the data owner is notified of the request. The data owner grants data permission, and the contract initiates the transaction.
def DataRequest(DataChainTxID, certification, reasons):
    if Verify(certification):
        user = Resolve(certification)
        # Resolve the on-chain data address to the raw data held by the Gatekeeper
        data = Fetch(DataChainTxID.data_address)
        if user == data.owner:
            return data
        if (DataChainTxID.permission_level == L3
                or (DataChainTxID.permission_level == L2
                    and user in DataChainTxID.permissions)):
            # Record the access on the BehaviorChain, then release the data
            initBehaviorChainTx(DataChainTxID, user, logs)
            return data
        else:
            # Log the denied attempt and notify the owner, who may grant permission
            initBehaviorChainTx(DataChainTxID, user, logs)
            Notify(DataChainTxID.owner, reasons)
            return CURRENTLY_NO_PERMISSION
    else:
        return CERTIFICATION_ERROR
5. Data Auditing:
Each data provider (MNO) receives regular data reports. Any malicious activity or misuse of the data is identified through identity authentication in the system.
Users have complete control over their data, including whether to withdraw it if any malicious activity is detected.
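As a sketch of what such a report could look like, the following assumed audit helper walks the BehaviorChain entries for one DataChain transaction, using the transaction layouts sketched earlier, and flags any access by a user the owner never permitted.

# Sketch of the auditing step: walk the BehaviorChain entries for one DataChain
# transaction and flag any access by a user the owner never permitted. It assumes
# the DataChainTx / BehaviorChainTx layouts sketched earlier in the article.
def audit(behavior_chain, data_tx, tx_id):
    report = []
    for entry in behavior_chain:
        if entry.data_chain_tx_id != tx_id:
            continue
        allowed = entry.user == data_tx.owner or entry.user in data_tx.permissions
        report.append((entry.user, entry.logs, "ok" if allowed else "flagged"))
    return report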
Conclusion: With 5G networks on the horizon, an organized and optimal AI-powered network can help MNOs, and even other businesses, cope with the data demand and data intensity required.
More importantly, the democratization of data among MNOs through blockchain-based data sharing can certainly boost AI-powered networks!