Wednesday, July 9, 2025

RivieraDev 2025 – Day 2 Highlights


🔑 Keynote: The Cost of Inaction – Quentin Adam & Nicolas Leroux

The second day of RivieraDev opened with an inspiring keynote by Quentin Adam and Nicolas Leroux on a critical theme: “The Cost of Inaction.”

Ken Olsen (1977):
“There is no reason anyone would want a computer in their home.”
A historic underestimation — just like Kodak, which invented the digital camera but didn’t win the market on it: a textbook example of shooting yourself in the foot.

Another major example: Blockbuster, which declined a partnership with Netflix — and is now long gone.

📉 Activity vs Purpose

  • It’s easy to get stuck in the comfort zone, buried in daily operations.
  • Companies like BlackBerry missed the iPhone revolution by clinging to outdated paradigms.

💸 COI vs ROI

  • Don’t confuse the Cost of Inaction (COI) with Return on Investment (ROI).
  • The longer you wait to innovate, the higher the Cost of Inaction.
  • Hidden costs include: tech debt, lack of ownership, and stagnation.

Reminder: Not choosing is still a choice — and often the most expensive one.

🌍 Europe vs USA: Two Innovation Philosophies

  • The US embraces failure and fast iteration — risk is encouraged.
  • Europe tends to favor caution, employment preservation, and regulatory stability.
  • But innovation sometimes means letting go — even closing a business if needed.

🚀 Key Takeaways

  • Velocity: Tools like ChatGPT, Anthropic’s Claude, and Google’s Gemini are moving fast.
  • Acceleration: We are living the “law of accelerating returns.”
  • AI Landscape: The French Villani report from 2018 was once just used to block a door…
  • In software development: visibility matters. Inaction costs you insight, momentum, and relevance.

Personal note: This was a deeply energizing talk — I left the amphitheater with a clear thought: Let’s innovate!


🛠️ Martin Kouba – Building Efficient MCP Servers

This session gave us a practical dive into building MCP (Model Context Protocol) servers — and how surprisingly easy it can be.

🧩 What is MCP?

  • An open protocol (MIT license) for integrating LLMs with tools and external resources
  • Bidirectional communication via JSON-RPC 2.0 (see the example below)
  • Transports include stdio, HTTP, and (deprecated) HTTP/SSE

MCP provides:

  • Server features: Tools, prompts, resources
  • Client features: Sampling, roots
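
To make the wire format concrete, here is a minimal sketch (my own illustration, not from the talk) of a tools/call request as it travels over the stdio transport; the get_weather tool and its arguments are hypothetical:
// Minimal sketch of an MCP "tools/call" request (JSON-RPC 2.0 envelope).
// The tool name and arguments are hypothetical examples.
const request = {
  jsonrpc: '2.0',
  id: 1,
  method: 'tools/call',
  params: {
    name: 'get_weather',          // a tool the server advertised via tools/list
    arguments: { city: 'Nice' },  // must match the tool's declared input schema
  },
};

// With the stdio transport, each JSON-RPC message is one line of JSON.
process.stdout.write(JSON.stringify(request) + '\n');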

⚠️ Missing Pieces in the Ecosystem

  • There’s a spec, but no official foundation supports it yet.
  • Anthropic maintains the GitHub org, but there’s:
    • No PR review process
    • No OSS governance
    • No shared API standard
    • No TCK (Technology Compatibility Kit) to validate implementations

🎯 Quarkus MCP Module – Goals

  • Provide a unified API to implement server-side MCP features
  • Expose programmatic access with @Inject

⚙️ Demo with Goose & Gemini

Martin demonstrated how to integrate an MCP server using Goose, a CLI/chat app, connected to Google’s Gemini LLM. The app was reactive, showcasing the real-time nature of:

  • Tool usage
  • Prompt chaining
  • Progress API — for managing long-running tools or resource operations

This session made it clear: while the protocol is still young and lacks formal governance, MCP is promising for building robust AI-enabled tools and agents.


🌞 Conclusion – Day 2

This second day was full of energy and meaningful discussions — not only during the talks, but also in the lively exchanges between sessions. That’s part of the true magic of attending RivieraDev.

Let’s not forget the delicious food and local specialties from the South of France: socca, pan bagnat, artisanal ice cream, and of course, the ever-present sunshine!

A fantastic end to an inspiring conference packed with brilliant keynotes, thought-provoking talks, and genuine community spirit. Already looking forward to the next edition!




RivieraDev 2025 – Let’s Go!

The 2025 edition of RivieraDev is officially underway! Amphitheater 339, with its 500-seat capacity, is completely full. The energy and anticipation in the room set the tone for an exciting lineup of talks and innovations.


🎤 Opening Keynote by Thibaut Giraud (aka Monsieur Phi on YouTube)

Talk: “Do LLMs Dream of Electric Knights?”

  • LLMs are often referred to as “stochastic parrots” — they generate text without understanding the world.
  • Humans also sometimes “parrot” without understanding — language is not always tied to deep comprehension.
  • Criticism like “LLMs just predict tokens” is an oversimplification.

Example: Magnus Carlsen’s unpredictable chess moves are strategic and contextual. Predicting them requires deep reasoning — something LLMs are increasingly capable of.

  • GPT-3.5-turbo-instruct (as of Sept 2023) can play chess with an Elo rating around 1800.
  • LLMs process games encoded in PGN (Portable Game Notation).
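
As a rough illustration (my own sketch, not from the talk), here is how you might probe that ability with the OpenAI Node SDK, feeding a partial game in PGN to the legacy completions endpoint:
import OpenAI from 'openai';

const client = new OpenAI(); // assumes OPENAI_API_KEY is set in the environment

// A partial Ruy Lopez opening in PGN; the model is asked to continue it.
const pgn = '1. e4 e5 2. Nf3 Nc6 3. Bb5 a6 4. Ba4 ';

const completion = await client.completions.create({
  model: 'gpt-3.5-turbo-instruct', // the completion model cited in the talk
  prompt: pgn,
  max_tokens: 5,
  temperature: 0,
});

console.log(completion.choices[0].text.trim()); // typically a legal move such as "Nf6"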

Takeaway: Negative results don't reveal much; they just tell you “it failed with this prompt.” Positive outcomes are more telling about LLM potential: what works now will keep working in the future.


🤖 Zineb Bendhiba – MCP in Action

Topic: Integrating AI Agents with Tools via MCP (Model Context Protocol)

  • MCP allows agents to interact with tools locally or remotely (via HTTP, for example).
  • LLMs are stateless — apps must manage memory and context (see the sketch below).
  • LangChain4j supports Java developers building agent-based LLM systems.
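
Statelessness is worth making concrete: the application keeps the history and re-sends the whole conversation on every call. A minimal sketch, where llm.chat is a hypothetical client helper:
// LLMs are stateless: the app owns the memory and replays it each turn.
const history = [];

async function ask(userMessage) {
  history.push({ role: 'user', content: userMessage });
  const reply = await llm.chat(history); // hypothetical call sending the full history
  history.push({ role: 'assistant', content: reply });
  return reply;
}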

📚 Guillaume Laforge – LLM Limitations & RAG

Limitations of LLMs:

  • No real-time awareness (e.g., current date)
  • Can’t access private data
  • Hallucinations
  • Limited token context (even with Gemini's large windows)

What is RAG (Retrieval-Augmented Generation)?

  1. Retrieve: Pull information from external sources (DBs, websites…)
  2. Augment: Add retrieved context to the prompt using templates:
    You must answer the question: {{question}}
    Based on this context: {{context}}
  3. Generate: LLM produces the final response

Advantages: More accurate, less hallucination, explainable via context, and up-to-date.

Implementing RAG:

  • Chunk documents and store in a vector DB
  • Use vector search to retrieve and add context (a minimal end-to-end sketch follows this list)
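
Here is a minimal sketch of the three steps glued together, where embed, vectorDb and llm are hypothetical stand-ins for your embedding model, vector store and LLM client:
// Minimal retrieve-augment-generate loop (hypothetical helpers).
async function answerWithRag(question) {
  // 1. Retrieve: vector-search the chunks closest to the question
  const queryVector = await embed(question);
  const chunks = await vectorDb.search(queryVector, { topK: 5 });

  // 2. Augment: inject the retrieved context into the prompt template
  const prompt = `You must answer the question: ${question}\n` +
                 `Based on this context: ${chunks.map((c) => c.text).join('\n')}`;

  // 3. Generate: the LLM produces the grounded answer
  return llm.generate(prompt);
}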

Advanced Techniques:

  • Chunking: fixed-length, overlapping, sentence-based, or parent-child structures (see the chunking sketch after this list)
  • Query compression and routing
  • Agentic RAG for multi-step reasoning
  • Hybrid RAG + long context models
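
For instance, a minimal fixed-length chunker with overlap could look like this (the sizes are arbitrary example values):
// Fixed-length chunking with overlap: consecutive chunks share `overlap`
// characters, so a sentence cut at a boundary survives whole in one chunk.
function chunkText(text, size = 500, overlap = 100) {
  const chunks = [];
  for (let start = 0; start < text.length; start += size - overlap) {
    chunks.push(text.slice(start, start + size));
  }
  return chunks;
}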

💡 Theo Gianella & Julien Sulpis – How Good Are You at Responsive CSS?

  • “Responsive Web Design” coined by Ethan Marcotte (2010)
  • Forget pixel-perfect – focus on flexibility and fluidity
  • Use modern CSS units: ch, vmin, clamp(), dvh, etc.
  • Use container queries instead of relying only on media queries

Reference: CSS Container Query Guide by Ahmad Shadeed


🧪 Laurent Dogin – LLM Tools in Nushell

  • Nushell + AI agents GitHub repo
  • Nushell is a cross-platform modern shell
  • Integration with AI uses vector DBs, the ReAct pattern, and TAO prompts (Thoughts, Actions, Observations); see the loop sketch after this list
  • Key concerns include managing identity, permissions, and chaining tools — similar to BPM pipelines
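
The ReAct pattern itself is language-agnostic; here is a minimal sketch of the loop in JavaScript (the talk implemented it in Nushell; llm, tools and parseAction are hypothetical helpers):
// Minimal ReAct loop: the model alternates Thought -> Action -> Observation
// until it stops requesting actions and gives a final answer.
async function react(question, maxSteps = 5) {
  let transcript = `Question: ${question}\n`;
  for (let i = 0; i < maxSteps; i++) {
    const step = await llm.generate(transcript); // model writes Thought + Action
    transcript += step;
    const action = parseAction(step);            // e.g. { tool: 'search', input: '...' }
    if (!action) return step;                    // no action requested: final answer
    const observation = await tools[action.tool](action.input);
    transcript += `\nObservation: ${observation}\n`;
  }
  return transcript; // give up after maxSteps and return the trace
}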

🧭 Arnaud Langlade – Example Mapping (Agile)

Problem: In grooming sessions, devs are passive, meetings run long, and delivery is often misaligned.

“It’s not stakeholder knowledge, but developer ignorance that gets deployed to production.” – Alberto Brandolini

Example Mapping (by Matt Wynne):

  • Short (30 min) session with “Three Amigos”: PM, dev, QA
  • Use cards or digital tools like Miro:
    • Yellow: user story
    • Blue: rules
    • Green: examples (BDD style)
    • Pink: open questions

Tips: Involve the whole team, rotate the scribe, split complex rules, and use examples for tests.


🛠️ Dhruv Kumar – Platform Engineering + AI

  • Dev productivity is low — only 11% of time is spent coding!
  • Cloud complexity, security concerns, and tool overload are major factors

Platform Engineering can help:

  • Standardizes SDLC practices
  • Security built-in
  • Supports tool flexibility
  • Improves visibility through DORA metrics
  • AI boosts developer experience with smart triage, alerts, and decision support

🎤 Conclusion – Day 1

That’s a wrap for Day 1 of the RivieraDev conference! From thought-provoking keynotes to hands-on tech sessions, the energy and ideas have been nothing short of inspiring.

Let’s see what tomorrow brings as we head into the second — and final — day of RivieraDev. Stay tuned!



Friday, January 18, 2019

FaaS tutorial 2: Set up Google Cloud Function

Now that we have deployed an app in FaaS tutorial 1: Start with Firebase and prepare the ground, it's time to spice up 🌶️ our basic app and add some back-end stuff.

How about defining a REST API to add a new record to the database? We'll use HTTP-triggered functions. There are different kinds of triggers for different use cases; we'll dig into that in the next post.

Let's start our tutorial, as always step by step 👣!

Step 1: init function

For your project to use Google Cloud Functions (GCF), use firebase cli to configure it. Simply run the command:
$ firebase init functions

     ######## #### ########  ######## ########     ###     ######  ########
     ##        ##  ##     ## ##       ##     ##  ##   ##  ##       ##
     ######    ##  ########  ######   ########  #########  ######  ######
     ##        ##  ##    ##  ##       ##     ## ##     ##       ## ##
     ##       #### ##     ## ######## ########  ##     ##  ######  ########

You're about to initialize a Firebase project in this directory:

  /Users/corinne/workspace/test-crud2


=== Project Setup

First, let's associate this project directory with a Firebase project.
You can create multiple project aliases by running firebase use --add, 
but for now we'll just set up a default project.

? Select a default Firebase project for this directory: test-83c1a (test)
i  Using project test-83c1a (test)

=== Functions Setup

A functions directory will be created in your project with a Node.js
package pre-configured. Functions can be deployed with firebase deploy.

? What language would you like to use to write Cloud Functions? JavaScript
? Do you want to use ESLint to catch probable bugs and enforce style? No
✔  Wrote functions/package.json
✔  Wrote functions/index.js
✔  Wrote functions/.gitignore
? Do you want to install dependencies with npm now? Yes
Below the ASCII art 🎨, Firebase gets chatty and tells you all about what it's doing.
Once you've selected a Firebase project (select the one we created in tutorial 1 with the Firestore setup), use the default options (JavaScript, no ESLint).

Note: By default, GCF runs on Node 6; if you want to enable Node 8, add the following JSON at the root level of your functions/package.json:
"engines": {
    "node": "8"
  }
You will need Node 8 for the rest of the tutorial, as we use async/await instead of Promise syntax.
Firebase has created a default package with an initial GCF bootstrap in functions/index.js.

Step 2: HelloWorld

Go to functions/index.js and uncomment the helloWorld function:
exports.helloWorld = functions.https.onRequest((request, response) => {
 response.send("Hello from Firebase!");
});
This is a basic helloWorld function; we'll use it just to get used to deploying functions.

Step 3: deploy

Again, use the Firebase CLI and type the command:
$ firebase deploy --only functions
✔  functions[helloWorld(us-central1)]: Successful update operation. 
✔  Deploy complete!

Please note that it can take up to 30 seconds for your updated functions to propagate.
Project Console: https://console.firebase.google.com/project/test-83c1a/overview
Note you can also deploy a single function with firebase deploy --only functions:myFunctionName.
If you go to the Firebase console and then to the Functions tab, you will find the URL where your function is available.



Step 4: try it

Since it's an HTTP-triggered function, let's try it with curl:
$ curl https://us-central1-test-83c1a.cloudfunctions.net/helloWorld
Hello from Firebase!
You've deployed and tried your first cloud function. 🎉🎉🎉
Let's now fulfil the same use-case as in tutorial 1: we want an HTTP-triggered function that inserts 2 fields into a database collection.

Step 5: onRequest function to insert in DB

  • In functions/index.js add the function below:
    const admin = require('firebase-admin');
    admin.initializeApp(); // [1]
    
    exports.insertOnRequest = functions.https.onRequest(async (req, res) => {
      const field1 = req.query.field1; // [2] 
      const field2 = req.query.field2;
      const writeResult = await admin.firestore().collection('items').add({field1: field1, field2: field2}); // [3]
      res.json({result: `Message with ID: ${writeResult.id} added.`}); // [4]
    });
    

    • [1]: import the Firebase Admin SDK to access the Firestore database and initialize with default values.
    • [2]: extract data from query param.
    • [3]: add the new message into the Firestore Database using the Firebase Admin SDK.
    • [4]: send back the id of the newly inserted record.
  • Deploy it with firebase deploy --only functions. This will redeploy both functions.
  • Test it by curling:
    $ curl https://us-central1-test-83c1a.cloudfunctions.net/insertOnRequest\?field1\=test1\&field2\=test2
    {"result":"Message with ID: b5Nw8U3wraQhRqJ0vMER added."}
    
Wow! Even better, you've deployed a cloud function that does something 🎉🎉🎉

Note that if your use-case is to call a cloud function from your UI, you can use an onCall GCF. Some of the boilerplate around security is taken care of for you. Let's try to add an onCall function!

Step 6: onCall function to insert in DB

  • In functions/index.js add the function below:
    exports.insertOnCall = functions.https.onCall(async (data, context) => {
      console.log(`insertOnCall::Add to database ${JSON.stringify(data)}`);
      const {field1, field2} = data;
      const doc = await admin.firestore().collection('items').add({field1, field2});
      return { id: doc.id }; // return a payload so the caller's `result` is not null
    });
    
  • Deploy it with firebase deploy --only functions. This will redeploy both functions.
  • Test it in your UI code. In tutorial 1, step 5, we defined a Create component in src/components/create.js; let's revisit the onSubmit method:
    onSubmit = (e) => {
        e.preventDefault();
        // insert by calling cloud function
        const insertDB = firebase.functions().httpsCallable('insertOnCall'); // [1]
        insertDB(this.state).then((result) => { // [2]
          console.log(`::Result is ${JSON.stringify(result)}`);
          this.setState({
            field1: '',
            field2: '',
          });
          this.props.history.push("/")
        }).catch((error) => {
          console.error("Error adding document: ", error);
        });
      };
    

    In [1] we pass the name of the function to retrieve a reference; we then simply call this function in [2] with a JSON object containing all the fields we need.

Where to go from here?

In this tutorial, you've gotten acquainted with Google Cloud Functions in their simplest form: HTTP-triggered. To go further into learning how to code GCF, the best way is to look at existing code: the firebase/functions-samples repo on GitHub is the perfect place to explore.
In the next tutorials we'll explore the different use-cases that best fit a cloud function.

Stay tuned!

Thursday, January 17, 2019

FaaS tutorial 1: Start with Firebase and prepare the ground

As an organiser of RivieraDEV, I was looking for a platform to host our CFP (call for papers). I bumped into the open source project conference-hall while wandering on Twitter (the gossip 🐦 bird is useful from time to time).

The app is nicely crafted and can be used for free. Even better, I learned afterwards that there is a hosted version! That's the one I wanted to use, but we were missing one key feature: sending emails to inform speakers of the deliberations and providing a way for speakers to confirm their participation.

💡💡💡 Open Source Project? Let's make the world 🌎 better by contributing...

At first look, conference-hall is a web app deployed on Google Cloud Platform. The SPA is deployed using Firebase tooling and makes use of the Firestore database. By contributing to the project, I got acquainted with Firebase. Learning something new is cool; sharing it is even better 🤗 🤩

Time to start a series of blog posts on the FaaS subject. I'd like to explore Google Cloud Functions, but also go broader and see how FaaS is implemented in the open source world.

In this first article, I'll share with you how to get started configuring a project from scratch in Firebase and how to deploy it, giving us a base project to introduce cloud functions in the next post. Let's get started, step by step...

Step 1️⃣: Initialise firebase project

Go to the Firebase console and create a Firebase project; let's name it test.

Step 2️⃣: Use Firestore

  • In the left-hand side menu, select the Database tab, then click Create Database. Follow the Firestore documentation if in trouble; the Firebase console UI is quite easy to follow. Note Firestore is still in beta at the time of writing.
  • Choose Start in test mode, then click the Enable button.
You should be forwarded to the Database explorer, where you can now add a new collection items as below:

Step 3️⃣: Bootstrap app

We use create-react-app to get an initial React app:
npx create-react-app test-crud
cd test-crud
npm install --save firebase
and then we've added the Firebase SDK.

Insert firebase config

  • We use react-scripts' env variable support
  • In .env.local, copy the variables from the Firebase console
  • In src/firebase/firebase.js, read the env variables and initialise Firebase:
    const config = {
      apiKey: process.env.REACT_APP_API_KEY,
      authDomain: process.env.REACT_APP_AUTH_DOMAIN,
      databaseURL: process.env.REACT_APP_DATABASE_URL,
      projectId: process.env.REACT_APP_PROJECT_ID,
      storageBucket: process.env.REACT_APP_STORAGE_BUCKET,
      messagingSenderId: process.env.REACT_APP_MESSAGING_SENDER_ID,
    };
    firebase.initializeApp(config);
    
    const settings = { timestampsInSnapshots: true }; // setting recommended by the Firestore SDK at the time of writing
    firebase.firestore().settings(settings);
    
This way you keep your secrets safe, not committed in your code 🤫🤫🤫
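
For reference, your .env.local would look something like this (placeholder values: copy the real ones from your Firebase console):
REACT_APP_API_KEY=AIzaSyXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
REACT_APP_AUTH_DOMAIN=test-83c1a.firebaseapp.com
REACT_APP_DATABASE_URL=https://test-83c1a.firebaseio.com
REACT_APP_PROJECT_ID=test-83c1a
REACT_APP_STORAGE_BUCKET=test-83c1a.appspot.com
REACT_APP_MESSAGING_SENDER_ID=123456789012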

Step 4️⃣: Add routing

npm install --save react-router-dom
mkdir src/components
touch src/components/create.js
And define the routes in src/index.js:
import React from 'react';
import ReactDOM from 'react-dom';
import { BrowserRouter as Router, Route } from 'react-router-dom'; // imports added for completeness
import App from './App';
import Create from './components/create';

ReactDOM.render(
  <Router>
    <div>
      <Route exact path='/' component={App} />
      <Route path='/create' component={Create} />
    </div>
  </Router>,
  document.getElementById('root')
);
On the root path, we'll display the list of items. In the Create component we'll define a form to add new items to the list.

Step 5️⃣: Access Firestore in the app

Let's define the content of the Create component in src/components/create.js:
import React, { Component } from 'react';
import { Link } from 'react-router-dom';
import firebase from '../firebase/firebase'; // assumes src/firebase/firebase.js exports the initialized firebase instance

class Create extends Component {
  constructor() {
    super();
    this.ref = firebase.firestore().collection('items'); // [1] retrieve items reference
    this.state = {
      field1: '',
      field2: '',
    };
  }
  onChange = (e) => {
    const state = this.state;
    state[e.target.name] = e.target.value;
    this.setState(state);
  };
  onSubmit = (e) => {
    e.preventDefault();
    const { field1, field2 } = this.state;
    this.ref.add({                                     // [2] Insert by using firestore SDK
      field1,
      field2,
    }).then((docRef) => {
      this.setState({
        field1: '',
        field2: '',
      });
      this.props.history.push("/")
    }).catch((error) => {
      console.error("Error adding document: ", error);
    });
  };

  render() {
    const { field1, field2 } = this.state;
    return (
      <div>
        <div>
          <div>
            <h3>
              Add Item
            </h3>
          </div>
          <div>
            <h4><Link to="/">Items List</Link></h4>
            <form onSubmit={this.onSubmit}>
              <div>
                <label htmlFor="title">field1:</label>
                <input type="text" name="field1" value={field1} onChange={this.onChange}  />
              </div>
              <div>
                <label htmlFor="title">field2:</label>
                <input type="text" name="field2" value={field2} onChange={this.onChange} />
              </div>
              <button type="submit">Submit</button>
            </form>
          </div>
        </div>
      </div>
    );
  }
}

export default Create;
It seems like a lot of code, but the key points are [1] and [2], where we use the Firestore SDK to add a new item to the database directly from the client app. The call in [2] will be revisited in the next blog post to make use of a cloud function.

Step 6️⃣: Deploy on firebase

So we've built a small test app accessing the Firestore DB; let's deploy it to the cloud with Firebase tooling 👍!
  • Start running a production build
    $ npm run build
    
  • Install firebase tools
    $ npm install -g firebase-tools
    $ firebase login
    
  • Init function
    $ firebase init
    
    • Step 1: Select the Firebase features you want to use: Firestore and Hosting. For now we focus only on deploying, i.e. hosting the app.
    • Step 2: Firebase command-line interface will pull up your list of Firebase projects, where you pick firebase-crud.
    • Step 3: Keep the default for the Database Rules file name and just press enter.
    • Step 4: Pay attention to the question about public directory, which is the directory that will be deployed and served by Firebase. In our case it is build, which is the folder where our production build is located. Type “build” and proceed.
    • Step 5: Firebase will ask you if you want the app to be configured as a single-page app. Say "yes".
    • Step 6: Firebase will warn us that we already have build/index.html. All fine!
  • deploy!
    $ firebase deploy
    ...
    ✔  Deploy complete!
    
        Project Console: https://console.firebase.google.com/project/test-83c1a/overview
        Hosting URL: https://test-83c1a.firebaseapp.com
    


Where to go from there?

In this blog post you've seen how to configure and deploy an SPA on Firebase and how to set up a Firestore DB. In the next blog post, you'll see how to write your first Google Cloud Function. Stay tuned.

Thursday, September 13, 2018

Unpublish a npm package

Last week, I was playing with semantic-release: giving your CI control over your semantic releases. Sweet. I should dedicate a write-up to it (to come later).
Nevertheless, I got into a situation where an erroneous version number got released (wrong commit message). Without a major version bump, a breaking change in the lib wasn't reflected (defeating the whole purpose of semantic release). 😱😱😱😱

Unpublish a "recent" version


If you try to unpublish a version just released:
$ npm publish .
+ launcher-demo@5.0.0
$ npm unpublish launcher-demo@5.0.0                   
- launcher-demo@5.0.0

It's OK! Phew, you can do it. 😅😅😅😅
Now is it possible later to publish the same version?
$ npm publish .                    
npm ERR! publish Failed PUT 400
npm ERR! code E400
npm ERR! Cannot publish over previously published version "5.0.0". : launcher-demo

It makes sense that you can't publish over the same version, so if you update package.json to 5.0.1:
$ npm publish .
+ launcher-demo@5.0.1

Just fine!

Unpublish a "old" version


Let's say I want to unpublish a version released last week:
$ npm unpublish launcher-demo@3.2.8
npm ERR! unpublish Failed to update data
npm ERR! code E400
npm ERR! You can no longer unpublish this version. Please deprecate it instead

Thanks npm for your kind suggestion; let's try to deprecate it with a short message:
$ npm deprecate launcher-demo@3.2.8 'erronous version'

At least now the package is visibly deprecated: trying to pull it will display a deprecation warning.
$ npm i launcher-demo@3.2.8
npm WARN deprecated launcher-demo@3.2.8: erronous version


Unpublish policy


"Old", "recent" version. What does it all mean? Let's check the npm unpublish policy

Quote: If the package is still within the first 72 hours, you should use one of the following from your command line:
  • npm unpublish <package-name> -f to remove the entire package thanks to the -f or force flag
  • npm unpublish <package-name>@<version> to remove a specific version

Some considerations:
Once package@version has been used, you can never use it again. You must publish a new version even if you unpublished the old one.
If you entirely unpublish a package, nobody else (even you) will be able to publish a package of that name for 24 hours.

After the one-developer-just-broke-Node buzzy affair of March 2016 (the left-pad incident), the unpublish policies were changed. A 10-line library used everywhere should not be able to take the whole JS community down. A step toward more immutability won't harm.

Where to go from there


Made an error releasing your package?
You've got 72 hours to fix it. 👍👍👍👍
Otherwise, deprecate it.
Maybe it's time to automate releasing with your CI. 😇😇😇😇


Sunday, June 25, 2017

Dirty secrets on dependency injection and Angular - part 2

In the previous post "Dirty secrets on dependency injection and Angular - part 1", you've explored how DI at component level can produce different instances of a service. Then you've experienced DI at module level: once a service is declared using one token in the AppModule, the same instance is shared across all the modules and components of the app.

In this article, let's revisit DI in the context of lazy-loading modules. You'll see the feature modules dynamically loaded have a different behaviour.

Let's get started...

Tour of hero app


Let's reuse the Tour of Heroes app that you should be familiar with from our previous post. All source code can be found on GitHub.

As a reminder, in our Tour of Heroes, the app displays a Dashboard page and a Heroes page. We've added a RecentHeroComponent that displays the recently selected heroes on both pages. This component uses the ContextService to store the recently added heroes.

In the previous blog post, you refactored the app and introduced a SharedModule that contains RecentHeroComponent and uses the ContextService. Let's refactor the app further and break it into more feature modules:
  • DashboardModule to contain the HeroSearchComponent and HeroDetailComponent
  • HeroesModule to contain the HeroesComponent


Features module


Here is a schema of what you have in the lazy.loading.routing.shared github branch:


DashboardModule is as below:
@NgModule({
  imports: [
    CommonModule,
    FormsModule,
    DashboardRoutingModule, // [1]
    HeroDetailModule,
    SharedModule            // [2]
  ],
  declarations: [
    DashboardComponent,
    HeroSearchComponent
  ],
  exports: [],
  providers: [
    HeroService,
    HeroSearchService
  ]
})
export class DashboardModule { }

In [1] you import the DashboardRoutingModule.

In [2] you import SharedModule, which defines common components like SpinnerComponent and RecentHeroComponent.

HeroesModule is as below:
@NgModule({
  imports: [
    CommonModule,
    FormsModule,
    HeroDetailModule,
    SharedModule,  // [1]
    HeroesRoutingModule
  ],
  declarations: [ HeroesComponent ],
  exports: [
    HeroesComponent,
    HeroDetailComponent
  ],
  providers: [ HeroService ] // [2]
})
export class HeroesModule { }

In [1] you import SharedModule, which defines common components like SpinnerComponent and RecentHeroComponent.
Note in [2] that HeroService is defined as a provider in both modules. It could be a candidate to be provided by SharedModule. This service is stateless, however, so having multiple instances won't bother us as much as it would for a stateful service.

Last, let's look at AppModule:
@NgModule({
  declarations: [ AppComponent ], // [1]
  imports: [
    BrowserModule,
    FormsModule,
    HttpModule,
    SharedModule,     // [2]
    InMemoryWebApiModule.forRoot(InMemoryDataService),
    AppRoutingModule  // [3]
  ],
  providers: [],      // [4]
  bootstrap: [ AppComponent ],
  schemas: [NO_ERRORS_SCHEMA, CUSTOM_ELEMENTS_SCHEMA]
})
export class AppModule {}

In [1], the declarations section is really lean, as most components are declared either in the feature modules or in the shared module.

In [2], you now import the SharedModule from AppModule. SharedModule is also imported in the feature modules. From our previous post we know that with statically loaded modules, the last declared token for a shared service wins: there is eventually only one instance defined. Is it the same for lazy loading?

In [3] we define the module used for lazy loading; more in the next section.

In [4], the providers section is as lean as declarations, since most providers are defined at feature-module level.

Lazy loading modules


AppRoutingModule is as below:
const routes: Routes = [
  { path: '', redirectTo: '/dashboard', pathMatch: 'full' },
  { path: 'dashboard',  loadChildren: './dashboard/dashboard.module#DashboardModule' }, // [1]
  { path: 'detail/:id', loadChildren: './dashboard/dashboard.module#DashboardModule' },
  { path: 'heroes',     loadChildren: './heroes/heroes.module#HeroesModule' }
]

@NgModule({
  imports: [ RouterModule.forRoot(routes) ],
  exports: [ RouterModule ]
})
export class AppRoutingModule {}

In [1], you lazy-load DashboardModule with the loadChildren routing mechanism.

Running the app, you can observe the same symptom as when we defined ContextService at component level: DashboardModule has a different instance of ContextService than HeroesModule. This is easily observable with 2 different lists of recently added heroes.

Checking the angular.io module FAQ, you can get an explanation for this behaviour:

Angular adds @NgModule.providers to the application root injector, unless the module is lazy loaded. For a lazy-loaded module, Angular creates a child injector and adds the module's providers to the child injector.

Why doesn't Angular add lazy-loaded providers to the app root injector as it does for eagerly loaded modules?
The answer is grounded in a fundamental characteristic of the Angular dependency-injection system. An injector can add providers until it's first used. Once an injector starts creating and delivering services, its provider list is frozen; no new providers are allowed.


What if you want a singleton ContextService shared across your whole app? There is a way...

Recycle provider with forRoot


Similar to what RouterModule uses: forRoot. Here is a schema of what you have in the lazy.loading.routing.forRoot github branch:



In SharedModule:
@NgModule({
  imports: [
    CommonModule
  ],
  declarations: [
    SpinnerComponent,
    RecentHeroComponent
  ],
  exports: [
    SpinnerComponent,
    RecentHeroComponent
  ],
  //providers: [ContextService], // [1]
  schemas: [NO_ERRORS_SCHEMA, CUSTOM_ELEMENTS_SCHEMA]
})
export class SharedModule {

  static forRoot() {            // [2]
    return {
      ngModule: SharedModule,
      providers: [ ContextService ]
    }
  }
 }

In [1] remove ContextService from the providers. In [2] define a forRoot method (the naming is a broadly accepted convention) that returns a ModuleWithProviders. This interface defines a module with a given list of providers. SharedModule will reuse the ContextService provider defined at AppModule level.

In all feature modules, import SharedModule.

In AppModule:
@NgModule({
  declarations: [
    AppComponent
  ],
  imports: [
    BrowserModule,
    FormsModule,
    HttpModule,
    //SharedModule, // [1]
    SharedModule.forRoot(), // [2]
    InMemoryWebApiModule.forRoot(InMemoryDataService),
    AppRoutingModule
  ],
  providers: [],
  bootstrap: [
    AppComponent
  ],
  schemas: [NO_ERRORS_SCHEMA, CUSTOM_ELEMENTS_SCHEMA]
})
export class AppModule {
}

In [1] and [2], replace the SharedModule import by SharedModule.forRoot(). You should only call forRoot at the highest level, i.e. the AppModule level; otherwise you will run into multiple instances.

To see the source code, take a look at the lazy.loading.routing.forRoot github branch.

Where to go from there


In this blog post you've seen how providers on lazy-loaded modules behave differently than in an app with only eagerly loaded modules.

Dynamic routing brings its lot of complexity and can introduce difficult-to-track bugs in your app, especially if you refactor from statically loaded modules to lazy-loaded ones. Watch your shared modules, especially if they provide services.

The Angular team even recommends avoiding providing services in shared modules. If you go that route, you still have the forRoot alternative.

Happy coding!

Friday, June 16, 2017

Dirty secrets on dependency injection and Angular - part 1

Let's talk about Dependency Injection (DI) in Angular. I'd like to take a different approach and tell you the things that surprised me when I first learned them, using Angular on larger apps...

A key feature of Angular ever since AngularJS (i.e. Angular 1.x), DI is a pure treasure, but the injector hierarchy can be difficult to grasp at first. Add routing and dynamic loading of modules, and everything can go wild... Services get created multiple times, and if they are stateful (yes, functional lovers, you sometimes need state), the global state (even worse 😅) gets out of sync in some parts of your app.
To get back in control of the singleton instances created for your app, you need to be aware of a few things.

Let's get started...

Tour of hero app


Let's reuse the Tour of Heroes app that you should be familiar with from when you first started at angular.io. Thanks to LarsKumbier for adapting it to webpack; I've forked the repo and adjusted it to my demo's needs. All source code can be found on GitHub.

In this version of Tour of Heroes, the app displays a Dashboard page and a Heroes page. I've added a RecentHeroComponent that displays the recently selected heroes on both pages. This component uses the ContextService to store the recently added heroes.


See AppModule in master branch.

Provider at Component level


Let's go to HeroSearchComponent in src/app/hero-search/hero-search.component.ts and change the @Component decorator:
@Component({
  selector: 'hero-search',
  templateUrl: './hero-search.component.html',
  styleUrls: ['./hero-search.component.css'],
  providers: [ContextService] // [1]
})
export class HeroSearchComponent implements OnInit {

If you add line [1], you get something like this drawing:



Run the app again.
What do you observe?
The Heroes page works fine, listing the recently visited heroes below. However, going to the Dashboard and its HeroSearchComponent, the recently visited heroes list is empty!!

The recently added heroes list is empty in HeroSearchComponent because you've got a different instance of ContextService. Dependency injection in Angular relies on hierarchical injectors that are linked to the tree of components. This means that you can configure providers at different levels:
  • for the whole application when bootstrapping it in the AppModule. All services defined in providers will share the same instance.
  • for a specific component and its sub-components. Same as before, but for a specific component: if you redefine providers at component level, you get a different instance. You've overridden the global AppModule providers.

Tip: don't define app-scoped services at component level. There are very rare use-cases where you actually want that.


Provider at Module level


What about providers at module level? If we do something like:



Let's first refactor the code to introduce a SharedModule as defined in the angular.io guide. In the SharedModule, we put the SpinnerComponent, the RecentHeroComponent and the ContextService. Having created the SharedModule, you can clean up the imports of AppModule, which now looks like:

@NgModule({
  declarations: [
    AppComponent,
    HeroDetailComponent,
    HeroesComponent,
    DashboardComponent,
    HeroSearchComponent
  ],
  imports: [
    BrowserModule,
    FormsModule,
    HttpModule,
    SharedModule,
    InMemoryWebApiModule.forRoot(InMemoryDataService),
    AppRoutingModule
  ],
  providers: [
    HeroSearchService,
    HeroService,
    ContextService
  ],
  bootstrap: [
    AppComponent
  ]
})
export class AppModule {}

Full source code on github here. Notice RecentHeroComponent and SpinnerComponent have been removed from declarations. Intentionally, ContextService appears twice: at SharedModule and at AppModule level. Are we going to have duplicate instances?

Nope.
A module does not have a specific injector (as opposed to components, which get their own injectors). Therefore when AppModule provides a service for the token ContextService and imports a SharedModule that also provides a service for the token ContextService, then AppModule's service definition "wins". This is clearly stated in the NgModule FAQ on angular.io.

Where to go from there


In this blog post you've seen how providers on components play an important role in how singletons get created. Modules are a different story: they do not provide encapsulation as components do.
In the next blog post, you will see how DI and dynamically loaded modules play together. Stay tuned.