Optimize Product Development with Teamcenter
Make your manufacturing and Engineering-to-Order projects smoother with Siemens Teamcenter. Teamcenter helps you work better together, cut costs, and scale as your needs change.

The CLEVR way: From vision to value
At CLEVR, we don’t just implement technology—we enable transformation. Our approach ensures that companies don’t just digitize but truly evolve by embedding Low Code, PLM, and MOM solutions in a structured, scalable way.
Key NX Features

Integrated Design, Simulation, and Manufacturing
Combine all aspects of product development into a single environment, reducing design iterations and accelerating time-to-market.
Why CLEVR?

- Proven Expertise: 20 years of low code experience, 3,500+ applications delivered.
- Tailored Solutions: A unique "Vision to Value" methodology ensuring measurable results.
- Global Recognition: Mendix Platinum Partner, awarded Best BNL Partner 2024.
- Customer Satisfaction: Score of 8.8 out of 10, reflecting our commitment to excellence.
- Certified Professionals: The largest team of Mendix expert developers and MVPs.
Compare licensing plans
Standard, Advanced, and Premium plans all build on NX X Design Standard, letting you create and edit designs of typical 3D parts and assemblies and more.
Stories from our customers
See how companies like yours are transforming with CLEVR.
Mendix allows us to rapidly adapt to new legal demands and security updates.



I think we build tomorrow together in different ways. We try to build the future by providing equipment to produce green hydrogen to enable the green transition, and CLEVR, with its information technology, will help us do that efficiently.




Find out how CLEVR can drive impact for your business
Related Resources

IoT - Service with a human touch
1990 – H2H
Hello, this is Jim speaking, how may I help you?
Does this sound familiar?
This was a time when people called Jim (or, preferably, visited him at his desk) and explained their problem to him. Jim was understanding and comforting and tried to fix the problem immediately, or otherwise created a ticket in the helpdesk system for later reference.
Discussions, reactions, emotions, and explanations were used to determine the urgency of tickets and the need to fix them within a certain timeframe. Human-to-human (H2H) interaction at its best.
Jim met (and in most cases knew) the person experiencing the problem and therefore had a “personal” connection with him or her. Solving the ticket and communicating progress was a natural part of his work ethic.
2000 – M2H
A couple of years later, more and more automation was added to support processes, in which problems were detected before customers even realized they existed. Interfaces with monitoring systems allowed organizations to log tickets automatically and dispatch them to Jim. You can see this as the Machine-to-Human (M2H) phase. From an efficiency perspective, this was a good idea; from a customer intimacy point of view, it was not.
Although Jim is solving tickets logged by a monitoring system, there are actual customers behind the affected services. Since Jim does not know exactly who these customers are, he has no personal connection with them. He does not feel he is helping a person; he is more or less helping the “system”. He loses some of his sense of purpose in resolving the issue, and the chance that the solution does not meet the customer’s expectations is substantial.
TODAY – M2M
The world is rapidly transforming into an internet-connected world. Devices are getting smarter; they are designed to help us, are always running, and are everywhere around us. Your phone, watch, TV, laptop, and TV box are some examples, and more devices are launched daily. In the near future, our homes, cars, and bikes will become sophisticated devices. Offices, hospitals, shopping centers, movie theaters, and grocery shops will have sensors, beacons, and tokens to determine who you are, where you are, and what you are doing (and did). And when you are not in a building, the outdoors will have sensors as well: automatic street lighting that turns on and off as someone approaches, finding empty parking spaces in a crowded city, trash pickup when needed, and crowd management, the so-called “Smart Cities”.
All these devices will collect data about their surroundings, which can be used for different purposes. Service-related information will also be present and can feed the service management tooling when something goes wrong. With triggers and actions that perform automatic restarts, reset devices, roll out firmware upgrades, or disable certain functionality, the human factor is completely removed from the service process. Do we really think this machine-to-machine (M2M) interaction is a good idea?
You know what… Let’s fire Jim, we don’t need him anymore.
Or do we?
At first glance, automating the service process is a good idea. The more automation you achieve, the more efficient your service process will be. But when there is customer interaction, you have to think twice. You do not only want the service restored as quickly and efficiently as possible; you also need to keep your customers in the loop. Who is paying for your service, the customer or the “machine”?

NEAR FUTURE – M2M2H
Hello, this is Jim speaking. We noticed that you had to make an emergency stop with your car on route 19. Are you okay? Do you need medical attention, or are you in need of the emergency services?
Oh, it is good to hear that everything is alright. Can I assist you with anything else?
Machine to Machine to Human (M2M2H) interaction at its best.

5 Reasons why you should use intelligence as a service
First of all, let me explain why I used the term Intelligence as a Service in my title. As you might know, the abbreviation IaaS is already being used for Infrastructure as a Service. Therefore, you won't come across ‘Intelligence as a Service’ as often as you might expect. While I was writing this article, I came across an article about AI (Artificial Intelligence) as a Service. In my perception, not all services in this blog include an ‘artificial’ part. Therefore, I decided to keep the title as it is.
Most of the solutions I will be writing about are currently being sold as PaaS (Platform as a Service), SaaS (Software as a Service) or BIaaS (Business Intelligence as a Service). In my opinion, the way these services are provided (often by only using an API) and the combined compute, intelligence and storage capacity, make them a special kind of cloud service. Business Intelligence as a Service might fit this context, but this term is often highly associated with management information.
In this article, you will discover which services are currently available and why you should consider implementing them in your app. Especially if your company is already working with a multi-app license, I think it is a great investment to discover new business potential. In most cases, I will name services from specific cloud providers as an example. This does not necessarily mean that this is the best provider for your business.
1. Artificial Intelligence: An interesting and sexy business case
Phenomena like AI (Artificial Intelligence) and ML (Machine Learning) are quite new to businesses, and not everyone inside your company will know what is currently possible using these ‘new’ techniques. Answering business requests with an AI or ML solution will create more awareness and can foster a more innovative way of thinking. Already-available data has the potential to make processes run more efficiently and eventually generate more profit as well as higher customer satisfaction.
Short example: the management department asks you to think about a solution for the decreasing customer satisfaction with support ticket handling. One of your suggestions could be to implement an intelligent API that determines the sentiment of all incoming support tickets. You can use the result to prioritize tickets based on the customer’s mood instead of only the urgency dropdown that may be part of your default ticket template. A sketch of such a call follows below.
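To make this concrete, here is a minimal Java sketch of such a call. The endpoint URL, the authentication header name, and the JSON shape are hypothetical placeholders; substitute whatever your sentiment provider actually documents.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class TicketSentiment {
    // Hypothetical endpoint and key: replace with your provider's actual
    // sentiment-analysis URL and authentication scheme.
    private static final String ENDPOINT = "https://api.example.com/sentiment/analyze";
    private static final String API_KEY = "<your-api-key>";

    public static String analyze(String ticketText) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Content-Type", "application/json")
                .header("Api-Key", API_KEY) // header name varies per provider
                .POST(HttpRequest.BodyPublishers.ofString(
                        // note: escape ticketText properly in real code
                        "{\"documents\":[{\"id\":\"1\",\"text\":\"" + ticketText + "\"}]}"))
                .build();
        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());
        // The response typically contains a sentiment label or score; parse it
        // with a JSON library and map it to a ticket priority in your workflow.
        return response.body();
    }
}

In a Mendix app, a call like this would typically live in a Java action that sets a priority attribute on the ticket entity.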
Of course, everything comes with a price. Still, most cloud solutions can be implemented at a fairly low and, most importantly, predictable cost. Especially when you are only evaluating the added value of a service, a free key is often available. For creating a simple showcase app, the combination of well-documented APIs and the Mendix platform is a very powerful one.
Because of the (often) clear pay-per-usage pricing, it is feasible to estimate the ROI accurately. In addition, most services can be cancelled at any moment. Together with the fast, innovative, and down-to-earth solutions, this makes implementing Intelligence as a Service an interesting business case.
2. Easy to implement
In the paragraph above, I mentioned the ease of creating a new account and retrieving new keys. Most cloud providers offer such constructions, which makes it easy for your business and technical engineers to experiment with and eventually implement the available services. In addition, most cloud providers (Amazon, IBM, Microsoft, etc.) offer well-documented Java SDKs, which can be used in Mendix as well. Both Amazon and IBM already have connectors available in the Mendix App Store, which makes implementation even easier. As an example, you can use the IBM Watson Connector Suite to use the Conversation service without writing a single line of code yourself. You can find more information about the available connectors on the Mendix website.
Quite a few services do not have a Mendix connector available at the moment. One of the challenges I ran into was the wide variation in authentication methods. It can be a challenge to build one authentication flow for several scenarios, but building it will help you understand the inner workings of the service. If you have the knowledge available to use a Java SDK instead, it can save you several hours when developing your prototype.
3. Scalable Innovation
Imagine a scenario in which your smart app is accepted by the business. A small group of people see the business potential and ask for the implementation of this feature. After the production release, more and more users start using the feature. As I mentioned before, several services work with a ‘requests-per-month’ license. The question is: can this become a problem? If the service stops responding because we have used up our license, this could unnecessarily hurt the end-user experience.
Like most cloud services, many AI services are highly scalable as well. In my experience, Amazon in particular has a very well-managed scaling mechanism. For example, you can choose to scale your license up and down automatically. It is also possible to choose between a variety of redundancy settings. When the data you’re processing is highly business-critical, you may want to choose a geo-redundant setup. In that situation, the service or data remains available even if a specific datacenter goes down.
If you’re interested in a service that is offered by multiple vendors, it is advisable to take this into consideration as well when comparing the selected providers.
4. Empowering your IoT
If you’re part of an already innovative organization, IoT may already be on the development roadmap. In most cases, this data is used as an event trigger or in dashboards. The raw data can be displayed as a chart, while the most recent value is displayed as a dynamic number.
In most scenarios, raw sensor data alone doesn’t tell you anything valuable. Therefore, you need to aggregate this data first. Several cloud providers offer solutions for exactly this purpose. Some examples are IBM Stream Computing, Microsoft Azure Stream Analytics, and Amazon Kinesis Analytics.
Most of the time, the basic configuration is simple and can be accomplished within a day. First, you define your source; for now, this will be your sensor endpoint (for example Microsoft Event Hub, Amazon Kinesis, or the IBM Event Hub). After that, you can use your SQL knowledge to aggregate the incoming data by writing a query. Because it is possible to configure multiple endpoints, you can set up your own Lambda architecture with both a hot and a cold stream. If you want to read more about the Lambda architecture, you can use this article as a starting point.

Figure 1: An example of a Lambda architecture. Source: https://github.com/awantik/pyspark-tutorial/wiki/Data-Processing-Architectures
Short example: when using Microsoft Stream Analytics, you can set up a hopping window to obtain a constant, aggregated dataset. If the number of input events per time unit varies, your SQL query can take the average value over the last X minutes and save it every minute. In addition, you may want to store the raw data in a NoSQL database. A plain-Java sketch of this windowing idea follows after the figure.

Figure 2: Hopping window. Source: https://msdn.microsoft.com/nl-nl/library/azure/dn835041.aspx
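To illustrate the hopping-window idea outside of Stream Analytics, here is a plain-Java sketch of my own (not a Stream Analytics API): every minute (the hop), it emits the average of all readings seen in the last five minutes (the window size).

import java.util.ArrayDeque;
import java.util.Deque;

public class HoppingWindowAverage {
    // Requires Java 16+ for records; a small static class works equally well.
    private record Reading(long timestampMillis, double value) {}

    private static final long WINDOW_MS = 5 * 60_000; // window size: 5 minutes
    private final Deque<Reading> buffer = new ArrayDeque<>();

    public void add(long timestampMillis, double value) {
        buffer.addLast(new Reading(timestampMillis, value));
    }

    // Call once per hop, e.g. from a scheduler that fires every minute.
    public double emit(long nowMillis) {
        // Evict readings that have fallen out of the five-minute window.
        while (!buffer.isEmpty()
                && buffer.peekFirst().timestampMillis() < nowMillis - WINDOW_MS) {
            buffer.removeFirst();
        }
        return buffer.stream().mapToDouble(Reading::value).average().orElse(Double.NaN);
    }
}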
5. Ongoing Improvements
Last but not least, your solutions will continue to advance without you adding a single line of code. One of the fundamentals of AI is that it gets better by learning. More organizations are using generally available AI APIs, and as a consequence the underlying models become more accurate over time. As a result, Speech-to-Text services, for example, become more accurate every month.
If the generally available AI services do not fit your needs, you can also choose to build the model yourself. In most cases, you can use Machine Learning services to achieve this. Understanding Machine Learning, however, is a discipline of its own.
Recently, Microsoft published a promotional video showcasing their new anomaly detection for live video. These features aren’t available yet, but it gives you a brief insight into the future possibilities.
Conclusion
To conclude: just give it a try! According to Gartner, AI, ML, and intelligent apps are among the top 10 technology trends for 2017. Mendix itself is also focusing more on smart apps in the future. Several weeks ago, they published a great blog about using IBM Watson Conversation in a fully automated chatbot. Definitely worth reading if you’re interested in using AI in one of your projects.
Hopefully you now have a broader understanding of the possibilities and available solutions involving AI. Want to know more about the technical implementation of such services? Please let us know in the comments.

Managing Java dependencies for Mendix modules
If you have ever tried to implement a certain function as a reusable module in Mendix, there’s a good chance you’ve had to add some jar dependencies. There are multiple ways of achieving that goal. But there’s one method that I find usually works best, because it simultaneously addresses multiple issues.
But first, let me rank the game-breaking or simply annoying issues I have encountered when trying to implement a reusable module:
- If you want to use a jar file in your module, e.g. pdfbox v2.3, but another version of this jar, e.g. v1.8, is already being used in another module of your application, you are out of luck. As far as I know, this is simply not possible: the class loader will pick up only one version, leaving one of the two modules to deal with the wrong version, which usually results in the application hanging or crashing. To make matters worse, this only surfaces at runtime, when a Java action with a conflicting dependency is invoked.
- If you are maintaining a Mendix module and you need to update a jar file within it, you need to make sure everyone deletes the old jar on update. Otherwise, you end up with two jars of different versions on your classloader path, which again likely leads to the application not working.
- Mendix requires you to manage transitive dependencies manually. This usually means you have to run the code to see which classes are missing, then find the right jar file and add it to the project manually. Then, rinse and repeat until there are no more “NoClassFound” errors.
- Wasting time when exporting modules is my next pet peeve. There are too many jars to include/exclude, especially if you are using something like community commons or the rest module in your project. Of course, that could be solved simply by a select/deselect button in the export dependencies dialog, but that is not the point of this post. One way or another, time is lost.
One way to effectively deal with all of these issues is to employ a build tool to produce a so-called fat jar: a single jar file that contains all your dependencies. This on its own resolves multiple issues from the list above: the jars bundled in the fat jar are updated automatically when the module is updated, solving point 2; the build tool takes care of all the transitive dependencies, solving point 3; and the only dependency left is a single jar file, eliminating point 4.
The first issue we mentioned can be resolved using a technique called shadowing. Shadowing replaces patterns in class names with a given string. For example, you can replace org.json with community.commons.org.json. This lets the Java classloader load two versions of the org.json library because they have different class names.
Case study: community commons
What better module to demonstrate the techniques described above than the community commons? In its current state, it has some 20 or so dependencies (check out that scroll bar).
In this instance, I will be using Gradle. But you can opt to do the same thing using other tools e.g. Maven or JarJarLinks.

Adding dependencies to Gradle
First, I installed the Gradle Eclipse plugin from the Eclipse marketplace. Next, I created a new Gradle project. The main file in every Gradle project is the build script, where many options can be specified, such as which Java version to use. The dependencies for a Gradle project are also defined in the build script. As you can see below, my first iteration of the build script is mostly standard stuff, with the exception of the shadowing tool that I added:
buildscript {
    repositories {
        mavenCentral() // look for dependencies here
    }
    dependencies {
        // this is the tool we will use to build the fat jar and shadow it
        classpath "com.github.jengelman.gradle.plugins:shadow:2.0.0"
    }
}

plugins {
    id 'com.github.johnrengelman.shadow' version '2.0.0'
    id 'java'
}

group 'com.mendix.community-commons'
version '1.0.0'

apply plugin: 'java'
apply plugin: 'maven'
apply plugin: 'eclipse'

task wrapper(type: Wrapper) {
    gradleVersion = '3.0'
}

compileJava {
    sourceCompatibility = 1.8
    targetCompatibility = 1.8
}

repositories {
    mavenLocal()
    mavenCentral()
}

dependencies {
}
So far so good. Next, I started adding the dependencies from the community commons module:
dependencies {
    // https://mvnrepository.com/artifact/org.owasp.antisamy/antisamy
    compile group: 'org.owasp.antisamy', name: 'antisamy', version: '1.5.3'
    // https://mvnrepository.com/artifact/com.google.guava/guava
    compile group: 'com.google.guava', name: 'guava', version: '14.0.1'
    // https://mvnrepository.com/artifact/commons-codec/commons-codec
    compile group: 'commons-codec', name: 'commons-codec', version: '1.10'
    // https://mvnrepository.com/artifact/org.apache.pdfbox/jempbox
    compile group: 'org.apache.pdfbox', name: 'jempbox', version: '1.8.5'
    // https://mvnrepository.com/artifact/joda-time/joda-time
    compile group: 'joda-time', name: 'joda-time', version: '2.9.6'
    // https://mvnrepository.com/artifact/commons-fileupload/commons-fileupload
    compile group: 'commons-fileupload', name: 'commons-fileupload', version: '1.2.1'
    // https://mvnrepository.com/artifact/commons-io/commons-io
    compile group: 'commons-io', name: 'commons-io', version: '2.3'
    // https://mvnrepository.com/artifact/org.apache.commons/commons-lang3
    compile group: 'org.apache.commons', name: 'commons-lang3', version: '3.0'
    compile group: 'org.apache.servicemix.bundles', name: 'org.apache.servicemix.bundles.batik', version: '1.8_1'
    // https://mvnrepository.com/artifact/org.apache.pdfbox/pdfbox
    compile group: 'org.apache.pdfbox', name: 'pdfbox', version: '2.0.3'
    // https://mvnrepository.com/artifact/xerces/xercesImpl
    compile group: 'xerces', name: 'xercesImpl', version: '2.8.1'
}
I noticed that some of the dependencies are not listed in Maven Central (or at least I could not find them). No problem—I have the jar files from the community commons project on GitHub. I created a folder libs in my Gradle project, and then added the com.springsource.org.apache.batik.css-1.7.0.jar and nekohtml.jar to it. Then, I added the following line to my dependencies, which, as you might expect, adds all jar files from the libs folder to the gradle project.
compile fileTree(dir: 'libs', include: '*.jar')
With that, all Java dependencies are resolved.
Dealing with Mendix classes
My basic premise is that each Java action should only call a corresponding function from the fat jar that I create. This is already the case for most Java actions in community commons, i.e. the actual implementation code is not inside the Java action class; instead, execution is delegated to another class. I copied the classes from the communitycommons package where the logic is implemented, specifically ConversationLog, DateTime, ORM, StringUtils, etc., to my Gradle project. I built the project and refreshed the build path in Eclipse. Success ... at least somewhat. The dependencies are loaded and recognized by Eclipse, but I can see some missing classes. Most of the missing classes are from the Mendix API, which is used extensively in community commons in the form of IContext, IMendixObject, Core, etc. I added them to the dependencies as well.
compile files('C:/Program Files/Mendix/7.2.0/runtime/bundles/com.mendix.public-api.jar')
compile files('C:/Program Files/Mendix/7.2.0/runtime/bundles/com.mendix.logging-api.jar')
I need to include these classes when developing in Eclipse because the compiler has to recognize them. Otherwise, the project will not compile. However, I do not want the Mendix API classes to go into my fat jar, so I excluded them.
shadowJar {
    dependencies {
        exclude 'com/mendix/**'
    }
}
After a rebuild and refresh, a lot of previously unrecognized classes are now resolved. Here is a screenshot of my Gradle project at this point:

But I can still see a few unrecognized imports. These are enumerations and proxy classes generated by the Mendix Modeler, such as system.proxies.FileDocument. I could not think of a way to add them to my list of dependencies, so I decided to wrap them. I declared an interface IFileDocument with the following code:
public interface IFileDocument {
    public IMendixObject getMendixObject();
    public boolean getHasContents();
}
I will only work with this interface within the fat jar. Then, I added a method to convert a system.proxies.FileDocument to an org.community-commons.main.IFileDocument. Every time a Java action needs to call some method from my fat jar file that works with files, I do a conversion and then pass the interface. A sketch of such an adapter follows below.
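Here is a minimal sketch of what that conversion could look like; the class name is mine, and the original project may structure it differently:

// Adapter that wraps the generated system.proxies.FileDocument proxy so that
// code inside the fat jar only depends on the IFileDocument interface.
public class FileDocumentAdapter implements IFileDocument {
    private final system.proxies.FileDocument delegate;

    public FileDocumentAdapter(system.proxies.FileDocument delegate) {
        this.delegate = delegate;
    }

    @Override
    public IMendixObject getMendixObject() {
        return delegate.getMendixObject();
    }

    @Override
    public boolean getHasContents() {
        return delegate.getHasContents();
    }
}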

That just leaves the enumerations and system.proxies.Language. For the enumerations, e.g. communitycommons.proxies.LogLevel, I used a wrapping method similar to the one I employed for the FileDocument. For the Language, I decided to just keep the whole code (all four lines of it) in the Java action. It has no other external dependencies, so why bother? If needed, though, it can be wrapped in a similar way. I did another rebuild and added the fat jar to the project. After everything compiles, these are the dependencies that remain in the Modeler.

Let me make a small digression here to talk about wrapping Mendix proxy classes. This solution is clearly not what you would normally want to do. But on the other hand, it is mostly mechanical work. The way I see things, this means there is some way to automate it. A tool can be developed that generates code for the wrapper classes. You could even go a step further and have the tool replace all usages of proxy classes with the corresponding wrapper classes.
Adding resources to a fat jar
As you can see in the screenshot, we managed to get rid of most files, with the exception of the antisamy XML files. I could just leave them as they are and everything would work fine, but they really bug me. For the sake of thoroughness, I will demonstrate how any resource files can be included in the fat jar; this is clearly optional, though.

As you probably already know, a jar file is just a zip file, which means we can include any file we want in it. In fact, Gradle automatically considers any files in a folder named resources as resource files and adds them to the jar. I created such a folder under src/main and copied all the XML files there. To verify that the resources are really added, I did a rebuild and then extracted the jar file; all the XML files were indeed included in the fat jar.
Next, I had to change the way the XML files are read. Previously, they were loaded from the resources folder just like regular files. This is not possible once they are packaged in a jar file; instead, you should use the classloader to load them as resources. I changed the following lines in StringUtils.XSSSanitize (which is part of the community commons):
String filename = Core.getConfiguration().getResourcesPath() + File.separator
+ "communitycommons" + File.separator + "antisamy"
+ File.separator + "antisamy-" + policyString + "-1.4.4.xml";
Policy p = Policy.getInstance(filename);
Instead of using a filename, I use the classloader to get the file contents as a stream:
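A minimal sketch of that change, assuming the XML files end up at the root of the jar (adjust the resource path if you keep them in a subfolder):

// Load the policy from the jar's resources via the classloader instead of
// from the resources folder on disk.
String resource = "antisamy-" + policyString + "-1.4.4.xml";
try (java.io.InputStream in =
        StringUtils.class.getClassLoader().getResourceAsStream(resource)) {
    Policy p = Policy.getInstance(in);
    // ... use the policy exactly as before
}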

Finally, we are left with only three dependencies:

- the Java action classes that interface with the fat jar,
- the fat jar file itself, and
- a .txt license file.
Because I do not see a way to reduce the number of dependencies any further, we can now move on to the next topic:
Shadowing
To keep it simple, I will just prepend the cc_ prefix to all external classes with the following code:

shadowJar {
    relocate('org.apache', 'cc_org.apache')
    relocate('org.cyberneko', 'cc_org.cyberneko')
    relocate('org.joda', 'cc_org.joda')
    relocate('org.owasp', 'cc_org.owasp')
    relocate('org.w3c', 'cc_org.w3c')
    relocate('org.xml', 'cc_org.xml')
    relocate('javax', 'cc_javax')
    relocate('java_cup', 'cc_java_cup')
    relocate('com.google', 'cc_com.google')
    dependencies {
        exclude 'com/mendix/**'
    }
}
Gradle will automatically replace the matching class names both in the files where these classes are defined and in the files where these classes are used.
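To make the effect concrete, here is an illustrative before/after, using a pdfbox class as an example (the two imports would of course live in different builds of the jar, not one file):

// Before shadowing, code in the fat jar references the original package:
import org.apache.pdfbox.pdmodel.PDDocument;

// After relocate('org.apache', 'cc_org.apache'), the same reference in the
// fat jar's bytecode points to the renamed package:
import cc_org.apache.pdfbox.pdmodel.PDDocument;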
Notice how I left out the org.community_commons package. These classes do not involve any third-party dependencies, so there is no danger of conflicts arising.
By opening the jar archive, I can confirm that all the classes that come from an external dependency are, in fact, renamed. That is all that's needed. Now, developers who use community commons in their project can use libraries that are also used by the community commons module without worrying about conflicts.

For example, let us assume we want to use the pdfbox library from Apache, which community commons also uses, because there is a newer version with some feature we like, or because some component has a dependency on an earlier version of pdfbox. That is now possible: we can add the pdfbox.jar to the userlib folder and use it in our code without running into any dependency-related issues.
At this point, I would like to stress that although it is technically possible to use shadowed classes in your Mendix project, as shown in this screenshot, this really should be avoided at all costs. There is no guarantee that the shadowed class will not be removed or renamed, and therefore no longer be available, when the module is updated.
Final thoughts
Packaging and shadowing dependencies correctly goes a long way toward preventing headaches when using Mendix modules. My hope is that these examples will motivate developers of App Store modules to start managing their internal dependencies in a way that makes everybody’s life a bit easier. Perhaps, if there is enough interest and support from the community, shadowing jar file dependencies will become part of some future Mendix best practices document for modules; better still, we might get an integrated build/shadowing tool inside the Mendix Modeler.
I hope you found this post interesting. If you have any remarks or ideas on how to improve this post or the code herein, please reach out to me. You can also check out the complete project on GitHub.
Happy coding!
-Andrej Gajduk
Frequently Asked Questions
Which industries does CLEVR serve?
CLEVR focuses on manufacturing and Engineering-to-Order organizations, combining Low Code, PLM, and MOM solutions to streamline product development and operations.
How does CLEVR support digital transformation?
Through its "Vision to Value" methodology, CLEVR embeds Low Code, PLM, and MOM solutions in a structured, scalable way, so companies don't just digitize but truly evolve.
What is CLEVR's experience and reach?
CLEVR brings 20 years of low code experience and more than 3,500 delivered applications. It is a Mendix Platinum Partner, was awarded Best BNL Partner 2024, and employs the largest team of Mendix expert developers and MVPs.
Who are some of CLEVR's notable clients?
CLEVR works with companies across manufacturing and beyond, including a producer of green hydrogen equipment featured in the customer stories above.