Implement a real time news feed using SPFx, Azure Logic Apps and socket.io

Let’s make a comparison with your favourite social network. In real life, do you refresh your Facebook/LinkedIn/Twitter feed by hitting the « F5 » key every time you want new updates? Obviously not. You are automatically notified in real time when something new happens. Getting back to an intranet portal scenario, why couldn’t it be the same, for instance for a news feed? Here we will implement a real time news feed using SPFx, Azure Logic Apps and socket.io.

Live example

In this example, I’ll show you how to implement this relatively simply, without a ton of code, for an Office 365 intranet portal. Maybe a news feed is not the most useful concrete application of this technique, but think about an alert banner or an urgent announcement: it makes more sense in those situations. The complete code is available in the SharePoint GitHub web parts repository:

https://github.com/SharePoint/sp-dev-fx-webparts/tree/dev/samples/react-real-time

So how do we achieve that?

Solution architecture

Here is the solution architecture:

 

SharePoint & Azure solution architecture

  1. The client part is implemented through an SPFx Web Part that first connects to the Azure web application via socket.io and subscribes to events.
  2. An Azure Logic App is used to catch new item creation events in the SharePoint list.
  3. When an item is added, the flow sends the item id to an Azure service bus queue in JSON format.
  4. A Node.js Azure web application listens to the queue and checks for new messages every 5 ms.
  5. When a new message is available, the web application emits the data to all subscribers via socket.io.
  6. The Web Part notifies the user that new items are available. For performance reasons, the items are only retrieved when the user clicks on the notification, via a single REST query built from the received ids (using the PnP JS Core library with « OR » conditions).

Client part

SPFx Web Part with ReactJS

The client part (i.e. where notifications are processed) can be implemented using any JavaScript framework. In this sample, I just wanted to use the new SharePoint Framework with the ReactJS library.

SharePoint Framework with the ReactJS library

 

A few things to notice here:

  • Because I’m not a designer, I just used the Office UI Fabric React components, especially the list and callout components, for the design part.
  • I used the SPFx Drop 5 version. This detail is important because this drop introduces TypeScript 2.0, which changes the way typings are resolved (via @types modules). I also used the PnP JS Core library version 1.0.5 accordingly (which uses TypeScript 2.0 as well) to get data from the SharePoint list (see the sketch below).
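
For reference, here is a minimal sketch of how such a query could look with the PnP JS Core library, assuming a hypothetical list named « News » and an array of received item ids (the exact list and field names in the sample may differ):

import pnp from "sp-pnp-js";

// Build a single REST query with « OR » conditions from the received ids
// (hypothetical list name and fields, adjust to your own environment)
const getItemsByIds = (itemIds: number[]): Promise<any[]> => {

    // e.g. "Id eq 3 or Id eq 7 or Id eq 12"
    const filter = itemIds.map(id => `Id eq ${id}`).join(" or ");

    return pnp.sp.web.lists.getByTitle("News")
        .items
        .select("Id", "Title", "Created")
        .filter(filter)
        .get();
};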

Real time communication

socket.io

To enable real time communication between the client and the server, I used the socket.io JavaScript library. This library is actually very simple to use and automatically chooses the right communication protocol depending on the client capabilities (web sockets or long polling):

Client side (socket.io-client module)

import * as io from 'socket.io-client';
...
public componentDidMount(): void {
 
    // Connect to the server
    const socket = io("https://WebAppac61f1b1-ecd7-4bc0-ad7d-619b0df1f757.azurewebsites.net");
 
    // Add the socket io listeners
    socket.on('item:added', (data) => {
      this._onItemAdded(data.customProperties.id);
    });
 
...

 

Server side (socket.io module)

var io = require('socket.io')(server);
...
 
// Broadcast to all connected clients
io.emit('item:added', message );               
 
...

 

Server part

Build your Azure Resource Manager template

When I started this sample, I used the « regular » Azure PowerShell cmdlets to create my resources (web application, service bus, etc.). I rapidly figured out that the biggest problem with this approach is that you can’t « organize » your resources in your Azure environment. By « organize », I mean set the same geographical location, put all your resources into one group with a familiar name, etc. When you use the regular cmdlets, most of them don’t have these kinds of parameters and Azure assumes a lot for you automatically. To fix this, you either have to reorganize your resources manually afterwards or use other cmdlets. Not very effective.

Because I didn’t want to mess up my environment with unstructured resources, I started to look at Azure Resource Manager (ARM) instead. ARM is a new way to provision resources within an Azure environment in a declarative manner, via a JSON configuration file.

The simplest way to build a template is to use the « Automation script » option in the Azure web portal:

Using the Automation script option in the Azure web portal

 

However, with this option, not all parameters can be exported. To help me build the template, I personally used the Azure Resource Explorer tool to get the right parameter values, together with the documentation on how to author a template.

Azure Resource Explorer tool

With this mechanism, you now have a clean, parameterized resource group structure that facilitates automation and maintenance. Check the « azure-deploy.json » file in the GitHub sample to view the full template.

Azure Logic App

The Logic App acts as an event receiver on the news list to catch item creation events. Within a logic app, we are able to interact directly with the Azure Service Bus via a dedicated action and pass some properties of the current item as JSON. The flow is pretty simple in our case:

Azure Logic App

Using a Logic App is a simple way to catch events for a SharePoint list without code, but keep in mind that there are some limitations:

  • There is no way to catch « item deleted » events for now.
  • Because the behavior is asynchronous, you can’t guarantee the item order in the service bus queue.

For complex scenarios, you may want to use SharePoint webhooks instead of a Logic App. However, webhooks are much more difficult to implement for now and require a lot of « plumbing ». To get started with webhooks, you can check this great sample.

Note: it’s possible to use Microsoft Flow as well (which is the « end user » version of Azure Logic Apps). However, be careful with the licensing plan, especially the flow frequency option.

Azure Service Bus

The service bus is a cloud messaging service for applications. Its goal is to make the link between SharePoint and Azure by storing messages. Within a service bus namespace, you can create topics or queues. For this sample, I used a queue to store elements (i.e. element ids). By default, there is no push notification when new messages are available in the queue, which means that consumer applications have to check regularly to get them.

Notes:

  • Because you can’t directly see the content of a queue or a topic in the Azure portal, I recommend using the « Service Bus Explorer » tool for debugging purposes.
  • From a Node.js application, you can use the « azure » npm module to work with the service bus from JavaScript (see the sketch below).
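
For example, here is a minimal sketch of how a Node.js application could push a test message into the queue with the « azure » module. It assumes the queue is named « news » and the connection string is exposed as an environment variable, and the message shape simply mirrors what the web part expects (the item id as a custom property):

import * as azure from "azure";

// The service bus connection string is taken from an environment variable (assumption)
const serviceBusService = azure.createServiceBusService(process.env.AZURE_SERVICEBUS_ACCESS_KEY);

// Hypothetical test message: the item id is passed as a custom property,
// which is what the web part reads (data.customProperties.id)
const message = {
    body: "New item notification",
    customProperties: {
        id: 42
    }
};

serviceBusService.sendQueueMessage("news", message, (error) => {
    if (!error) {
        console.log("Message sent to the « news » queue");
    }
});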

Azure Web Application

To notify our client application, we need to implement a back-end server application. However, within a SharePoint context, the server must meet at least two requirements:

  • HTTPS support: Because SharePoint Online sites are secured with HTTPS by default, you can’t call an HTTP endpoint from your code; requests will be blocked by the browser.
  • CORS support: Because your site and the Azure web application are not on the same domain, your web application needs to allow cross-domain calls (CORS).

In this sample, I used a Node.js Azure Web Application instead of a simple cloud service (web or worker role). Actually, in an early version of this sample, I tried to use a web role, but setting up CORS was too painful, and the HTTPS setup was very tricky in the context of a Node.js application. The main advantage of using an Azure web application is that both are enabled by default. You don’t have to do any extra configuration, but keep in mind that with a Node.js Azure web application (or a web role) you’re not using a « raw » Node.js server « directly »: there is something called IISNODE handling requests for you and doing some voodoo magic in front of your app.
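
That said, if you were to host the Node.js server yourself instead of relying on an Azure web application, you would have to handle CORS at the application level. A minimal sketch with the « cors » npm middleware could look like this (the allowed origin is a hypothetical tenant URL, adjust to your own):

import * as express from "express";
import * as cors from "cors";

const app = express();

// Allow cross-domain calls from the SharePoint Online tenant only (hypothetical URL)
app.use(cors({
    origin: "https://yourtenant.sharepoint.com"
}));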

The server code is actually very simple. It polls the service bus queue every 5 ms and notifies all subscribers when a new message is available:

// The port will be assigned automatically by the Azure Web App (process.env.port). For localhost debugging, we use 8080.
// You can use the built-in Visual Studio Code debugger to test the solution locally.
var port = process.env.port || 8080;
var app = require('express')();
var server = require('http').Server(app);
var io = require('socket.io')(server);
var azure = require('azure');
 
server.listen(port);
 
app.get('/', function (req, res) {
  res.sendFile(__dirname + '/index.html');
});
 
// The Service Bus connection string is retrieved from the web app's application settings (environment variable)
var serviceBusService = azure.createServiceBusService(process.env.AZURE_SERVICEBUS_ACCESS_KEY);
 
// Listener function to pull the Azure service bus and see if new messages are available
setInterval(function() {
 
    serviceBusService.receiveQueueMessage('news', function(error, message){
        if(!error) {
 
            // Message received and deleted (default behavior of the service bus)
            console.log(message);
 
            // Broadcast to all connected clients
            io.emit('item:added', message );              
        }
    });
 
}, 5 );

 

Enjoy!
