Last week I wrote about the database Notic Meet will use: NoSQL document storage with MongoDB.
With V1 of the DB design created, it’s time to work on the initial backend services for Notic Meet. Rather than build all of them before starting the front end, we will work in vertical slices, with the front end and back end built together for each feature.
Before starting on those vertical slices, we are getting the backend and the server project to a point where new models and endpoints can be added easily.
The app uses .NET and C#, with the project set up from the Blazor WebAssembly App template with the ASP.NET Core Hosted option selected.
This template provided a good enough start and created three projects within the solution:
- Client – used for the front end
- Server – home to the API controllers as well as the services those controllers access
- Shared – which, in our case, will store some frontend logic and connect the front end to the back end so that we don’t need to deal with HTTP requests in the Client project
On the front end, I switched to MudBlazor, a component library I have used for the past year since moving to C#.
To briefly demonstrate how communication flows from the front end to the back end, it starts with setting up dependency injection in Program.cs in the Client project:
```csharp
builder.Services.AddTransient<IClientUserService, ClientUserService>();
builder.Services.AddTransient<IClientMeetingService, ClientMeetingService>();
```
This registers ClientUserService and ClientMeetingService with the dependency injection container. The former will be the home for all interactions and logic related to users, and the latter will cover meetings. Several other client services will be added, but these two are enough to get the backend to a usable state. AddTransient means that a new instance of each class is created every time the service is requested.
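The contracts behind those registrations aren’t shown in this post, but as a minimal sketch, IClientUserService might look like this; only the UpdateUser member is taken from the example used later, the rest is left out:

```csharp
// Minimal sketch of the client-side user service contract.
// UpdateUser matches the call shown later in this post; LoginResult and
// UpdateUserDto are the types used in that example.
public interface IClientUserService
{
    Task<LoginResult> UpdateUser(UpdateUserDto data);
}
```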
To use a client service, we inject it into the .razor pages as follows:
@inject IClientUserService ClientUserService
In the code section of each Razor page, we can now call endpoints such as:
var user = await ClientUserService.UpdateUser(updateUserDto);
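Pulling those two pieces together, a page that updates a user could look roughly like the sketch below; the route, the MudButton, and the SaveUser handler are placeholders rather than code from the real app:

```razor
@* Hypothetical page: the route, button, and handler names are placeholders. *@
@page "/profile"
@inject IClientUserService ClientUserService

<MudButton OnClick="SaveUser">Save</MudButton>

@code {
    private UpdateUserDto updateUserDto = new();

    private async Task SaveUser()
    {
        // The page never touches HttpClient directly; it goes through the client service.
        var user = await ClientUserService.UpdateUser(updateUserDto);
    }
}
```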
The implementation of ClientUserService then follows this pattern to reach the API endpoint on the server:
public Task<LoginResult> UpdateUser(UpdateUserDto data) => Post<LoginResult>("user/updateuser", data);
This makes a POST request to the server, calling the user/updateuser endpoint and passing the UpdateUserDto as the request body.
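The Post<T> helper itself isn’t shown in this post. A minimal sketch, assuming the client services inherit from a small base class that wraps the HttpClient registered by the Blazor template and uses System.Net.Http.Json, could look like this:

```csharp
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

// Sketch only: the base class name and error handling are assumptions,
// not code from the real project.
public abstract class ClientServiceBase
{
    private readonly HttpClient _http;

    protected ClientServiceBase(HttpClient http) => _http = http;

    protected async Task<T?> Post<T>(string uri, object data)
    {
        // Serialise the DTO to JSON, POST it to the API, and deserialise the response body.
        var response = await _http.PostAsJsonAsync(uri, data);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadFromJsonAsync<T>();
    }
}
```

ClientUserService would then inherit from a base like this, which is why its UpdateUser method can be a one-line expression.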
In the UserController file in the Server project, we set up an HttpPost endpoint:
[HttpPost("updateuser")]
public async Task<IActionResult> UpdateUser(UpdateUserDto data)
{
if (string.IsNullOrEmpty(data.UserId))
{
return BadRequest();
}
var result = await _userService.UpdateUser(data);
return Ok(result);
}
In the controller above, we have defined an HttpPost endpoint called updateuser that accepts the UpdateUserDto model. We check that it contains a UserId; if it does, we call the UserService’s UpdateUser method and return the result.
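The _userService field in that action is the server-side user service, injected through the controller’s constructor. As a sketch, assuming an IUserService interface (the post doesn’t name it) and routing consistent with the user/updateuser URL:

```csharp
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("[controller]")] // resolves to "user/..." for UserController; routes are case-insensitive
public class UserController : ControllerBase
{
    // IUserService is an assumed name for the server-side user service interface.
    private readonly IUserService _userService;

    public UserController(IUserService userService)
    {
        _userService = userService;
    }

    // ... the UpdateUser action shown above lives here ...
}
```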
So far, we have called the ClientUserService, which calls an endpoint defined in a controller on the server, which then calls the UserService where all of the logic to update the user will be held.
I won’t go into how the data is fetched from MongoDB at this point, as we only have a prototype of that part, but a brief explanation is as follows:
The UserService accesses the data context, which will look something like this:
dataContext.Users.Update(data);
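As a sketch of where that line sits, the UserService might look something like the following; the IDataContext name, the awaited Update call, and the LoginResult return are assumptions, since this layer is still a prototype:

```csharp
// Sketch only: the data context is still a prototype, so the interface name,
// whether Update is awaited, and the return value are all assumptions.
public class UserService : IUserService
{
    private readonly IDataContext _dataContext;

    public UserService(IDataContext dataContext)
    {
        _dataContext = dataContext;
    }

    public async Task<LoginResult> UpdateUser(UpdateUserDto data)
    {
        // All user-update logic and validation lives here; the data context
        // hides how and where the document is actually persisted.
        await _dataContext.Users.Update(data);
        return new LoginResult();
    }
}
```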
If you wanted to update a meeting instead, the call would typically go through ClientMeetingService > MeetingController > MeetingService and then to dataContext.Meetings.Update(data);
By working through the data context, we abstract away the logic of how data is fetched. Currently, the data context uses a MongoDataRepository class that interacts directly with our MongoDB. Because MongoDataRepository sits behind an interface, though, we could switch to a CosmosDataRepository if we decided to use a different provider. We would need to migrate any data once the web app has gone live, but the option is there should we switch. Abstracting the logic behind the data context means that the UserService, MeetingService, and any other service don’t need to worry about how the data layer works or where the data comes from.
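That interface isn’t shown in the post; purely as an illustration of the idea, it might be shaped something like this (the IDataRepository name and members are placeholders):

```csharp
// Placeholder shape for the repository abstraction behind the data context.
public interface IDataRepository
{
    Task<T?> GetById<T>(string collection, string id);
    Task Upsert<T>(string collection, string id, T document);
    Task Delete(string collection, string id);
}

// MongoDataRepository implements this against MongoDB today; a CosmosDataRepository
// could implement the same interface later without the services above changing.
```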
We are about 70% done with the data context, about 70% done with the MongoDataRepository, and 0% done with Redis caching. For that last part, the MongoDataRepository will likely implement an ICaching interface, with some logic to decide what needs to be cached, for how long, and when it needs to be flushed or refetched.
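Nothing has been built for caching yet, so purely as a speculative sketch, the ICaching interface might end up along these lines:

```csharp
// Speculative sketch: none of this exists yet, and the member shapes are guesses.
public interface ICaching
{
    Task<T?> Get<T>(string key);
    Task Set<T>(string key, T value, TimeSpan expiry);
    Task Flush(string key);
}
```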
Let us know in the comments if you have any questions.