Upcoming Webinar with Cameron Gregor: How We Are Using XPages

Aug 17, 2019 10:05 AM

Tags: xpages

I ended my XPages post the other day with a request for people who are working on large XPages applications to hit me up on Twitter to tell me about them. Shortly thereafter, the estimable Cameron Gregor did just that. Moreover, he had the suggestion of turning the discussion into an open webinar, so that others can join.

He made a post on his site with the details, a summary of the plan, and a handy time-zone table to account for our respective locations.

I'm pretty curious to take a look myself, and I hope you'll join us in just over a week!

Developing an Open/WebSphere Liberty UserRegistry with Tycho

Aug 16, 2019 3:08 PM

In my last post, I put something of a stake in the ground and announced a multi-blog-post project to discuss the process of making an XPages app portable for the future. In true season-cliffhanger fashion, though, I'm not going to start that immediately, but instead have a one-off entry about something almost entirely unrelated.

Specifically, I'm going to talk about developing a custom UserRegistry and TrustAssociationInterceptor for Open Liberty/WebSphere Liberty. IBM provides documentation for this process, and it's serviceable enough, but there were some specific things I had to learn coming at it from a Domino perspective.

What These Services Are

Before I get into the specifics, it's worth discussing what these services actually are, especially TrustAssociationInterceptor with its ominous-sounding name.

A UserRegistry class is a mechanism to provide a Liberty server with authentication and user info services. Liberty has a couple of these built-in, and the prototypical ones are the basic and LDAP registries. Essentially, these do the job of the Directory and Directory Assistance on Domino.
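To give a sense of the shape of that contract, here are a few of the methods a custom registry has to answer, pulled from memory of the com.ibm.websphere.security.UserRegistry interface - so treat this as a heavily-abbreviated excerpt rather than the real thing. The full interface has quite a few more methods (certificate mapping, group searches, credential creation), and I've trimmed the checked exceptions for readability:

```java
import java.util.List;

// Abbreviated excerpt of com.ibm.websphere.security.UserRegistry:
// the real interface has more methods and declares checked exceptions
// on each of these.
public interface UserRegistryExcerpt {
    // The name of the realm this registry serves
    String getRealm();

    // Validate a name/password pair, returning the canonical user name on success
    String checkPassword(String userSecurityName, String password);

    // Basic existence and name-mapping queries
    boolean isValidUser(String userSecurityName);
    String getUniqueUserId(String userSecurityName);
    String getUserDisplayName(String userSecurityName);

    // Group membership, much like resolving groups in the Domino directory
    List<String> getGroupsForUser(String userSecurityName);
}
```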

A TrustAssociationInterceptor class is related. What it does is take an incoming HTTP request and look for any credentials it understands. If present, it tells Liberty that the request can be considered authenticated for a given user name. The classic mechanisms for this are HTTP Basic and form-cookie authentication, but this can also cover mechanisms like OAuth. In Domino, this maps to the built-in authentication mechanisms and, more particularly, to DSAPI filters.

How I Used Them

My desire to implement these developed when I was working on the Domino Open Liberty Runtime. I wanted to allow Liberty to use the containing Domino server as a user registry without having to enable LDAP and, as a stretch goal, I wanted to have some sort of implicit SSO without having to configure LTPA.

So I ended up devising something of an ad-hoc directory API exposed as a servlet on Domino, which Liberty could use to make the needed queries. To pair with that, I wrote a TrustAssociationInterceptor implementation that looks for Domino auth cookies in incoming requests, makes a call to a small servlet with that cookie, and grabs the associated username. That provides only one-way SSO, but that's good enough for now.
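To illustrate the shape of that interceptor, here's a simplified sketch rather than my actual implementation - the cookie names are the standard Domino ones, but resolveUserName is just a stand-in for the HTTP call back to the Domino-side servlet:

```java
package org.example.tai;

import java.util.Properties;

import javax.servlet.http.Cookie;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import com.ibm.websphere.security.WebTrustAssociationFailedException;
import com.ibm.wsspi.security.tai.TAIResult;
import com.ibm.wsspi.security.tai.TrustAssociationInterceptor;

public class ExampleDominoTAI implements TrustAssociationInterceptor {

    @Override
    public int initialize(Properties props) {
        return 0;
    }

    @Override
    public boolean isTargetInterceptor(HttpServletRequest req) {
        // Only step in when a Domino auth cookie is present at all
        return findDominoCookie(req) != null;
    }

    @Override
    public TAIResult negotiateValidateandEstablishTrust(HttpServletRequest req, HttpServletResponse resp)
            throws WebTrustAssociationFailedException {
        Cookie cookie = findDominoCookie(req);
        if (cookie != null) {
            // Resolve the session cookie to a user name by asking the Domino side
            String userName = resolveUserName(cookie.getValue());
            if (userName != null) {
                return TAIResult.create(HttpServletResponse.SC_OK, userName);
            }
        }
        // Otherwise, let the rest of the authentication chain take over
        return TAIResult.create(HttpServletResponse.SC_UNAUTHORIZED);
    }

    @Override
    public String getVersion() {
        return "1.0";
    }

    @Override
    public String getType() {
        return getClass().getName();
    }

    @Override
    public void cleanup() {
        // nothing to clean up in this sketch
    }

    private Cookie findDominoCookie(HttpServletRequest req) {
        Cookie[] cookies = req.getCookies();
        if (cookies != null) {
            for (Cookie cookie : cookies) {
                // DomAuthSessId for single-server sessions, LtpaToken* for SSO configurations
                if ("DomAuthSessId".equals(cookie.getName()) || cookie.getName().startsWith("LtpaToken")) {
                    return cookie;
                }
            }
        }
        return null;
    }

    private String resolveUserName(String cookieValue) {
        // Stand-in for the HTTP call to the Domino-side servlet, which would
        // return the authenticated user name for this session, or null
        return null;
    }
}
```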

The Easy Part

The good part was that my assumption going in - that my comfort with Tycho would help - was generally correct. Since the final output I wanted was a bundle, I was able to just add it to my project structure like any other and work with it in Eclipse's PDE normally. Tycho and PDE didn't necessarily help much - I still had to track down the Liberty API plugins and make a local update site out of them - but that was old hat by this point.

What Made Development Weird

I went into the project in high spirits: the interfaces required weren't bad, and Liberty uses OSGi internally. I figured that, with my years of OSGi experience, this would be a piece of cake.

And, admittedly, it kind of was. The core concepts are the same: building with Tycho, bundle activators, MANIFEST.MF, and all that. However, Liberty's use of OSGi is, I believe, much more modern than Domino's, and certainly much less focused on Equinox specifically.

For one, though Liberty is indeed OSGi-based, it doesn't use Maven+Tycho for its build process. Instead, it uses Gradle and the often-friendlier bnd tooling to handle its OSGi composition. That's not too huge of a difference, and the build process doesn't really affect the final built feature. The full differences are a whole big topic on their own, but the way they shake out for this purpose is essentially a difference in philosophy, and the different build mechanism was something of a herald of the downstream distinctions.

One big way this shows up is in service registration. Where Equinox-based apps, with their Eclipse heritage, tend to use "plugin.xml" to register services, Liberty (and most others, I assume) favors programmatic registration of services inside the bundle activator. While this does indeed work on Equinox (including on Domino), this was the first time I'd encountered it, and it took some getting used to.
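For anyone who hasn't run into that style, it looks roughly like this - a minimal sketch, where ExampleUserRegistry is a hypothetical class implementing the UserRegistry interface, and a real activator would register the TrustAssociationInterceptor the same way:

```java
package org.example.registry;

import java.util.Hashtable;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceRegistration;

import com.ibm.websphere.security.UserRegistry;

public class Activator implements BundleActivator {
    private ServiceRegistration<UserRegistry> registration;

    @Override
    public void start(BundleContext context) throws Exception {
        // Rather than declaring the service in plugin.xml, register it with
        // the OSGi service registry directly when the bundle starts
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("service.vendor", "Example");
        // ExampleUserRegistry is a hypothetical implementation of UserRegistry
        registration = context.registerService(UserRegistry.class, new ExampleUserRegistry(), props);
    }

    @Override
    public void stop(BundleContext context) throws Exception {
        if (registration != null) {
            registration.unregister();
        }
    }
}
```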

The other oddity was how you encapsulate your bundle as a feature in Liberty parlance. Liberty uses the term "feature" to refer to individual components that make up the server, and which you can configure in the "server.xml" file. These are defined using files similar to MANIFEST.MF, with specialized headers to declare the name of the feature, the bundles that make it up, and any APIs it provides to the server and apps. In my case, I wrote a generic mechanism to deploy these features when a server is established, which writes the manifest files to the server's feature directory. Once they're deployed, they become available to the server as a feature with the "usr" prefix, like "usr:dominoUserRegistry-1.0" for my case.
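To give a rough idea of the format (the names here are placeholders, and I'm recalling the headers from the Liberty docs, so treat the specifics as approximate), such a feature manifest looks along these lines:

```
Subsystem-ManifestVersion: 1
Subsystem-SymbolicName: dominoUserRegistry-1.0; visibility:=public
Subsystem-Version: 1.0.0
Subsystem-Type: osgi.subsystem.feature
Subsystem-Content: org.example.dominoUserRegistry; version="[1.0.0,2.0.0)"
IBM-Feature-Version: 2
IBM-ShortName: dominoUserRegistry-1.0
```

With that manifest in the user extension's feature directory and the bundle JAR alongside it, the feature can then be enabled in "server.xml" like any built-in one.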

In The Future

I have some ideas for additional features I'd like to develop - providing implicit APIs for Darwino and Jakarta NoSQL/JNoSQL would be handy, for example. This route went pretty smoothly, but I'll probably develop non-Domino ones using either Gradle or Maven with the maven-bundle-plugin. Either way, it ended up fairly pleasant once I discarded my old assumptions, and it's another good entry in the "pros" column for Liberty.

How Do You Solve a Problem Like XPages? Redux

Aug 13, 2019 5:33 PM

Tags: xpages

The better part of a year ago, I mused about what to do about the XPages problem. For better or for worse, I'm still musing about it to this day, without a good answer. The nice part for me is that it's not actually my job to come up with a real plan to "solve" XPages on a grand scale, but I do have my own set of clients I'm responsible for, and I've been personally invested in it for so long that I'd like to be involved in bringing it to a safe landing.

Moving Away

And I do think that that "landing" almost definitely has to be a path from XPages to something non-XPages. Hypothetically, a path forward would be for HCL to staff up on a new XPages team and improve the platform. Even if they did, though, I don't think it'd be wise for customers to rely on that - not even because I'd doubt the intent, but rather because any single-vendor, closed-source web stack without a large developer community to buffer it is an unreliable foundation.

If it'd be unsafe to rely on a revitalized platform, I think it's certainly unsafe to rely on one that's clearly in maintenance mode. The nature of web development is such that a stationary platform may as well be moving backwards, and not just because it will miss out on Web Workers or other new technology. Sooner or later, Safari or Chrome will remove a capability for security purposes and either Domino or the Dojo version XPages uses will be caught flat-footed. We've already been dealing with different definitions of "define" and various security improvements tripping us up for pretty much the entire life of XPages, and that kind of thing certainly isn't going to go away. Heck, how long do you figure user agents are going to remain reliable? It's unfortunate, but the fact that XPages came out of that "Web 2.0" era means a given page is less likely to function properly in five years than a JavaScript-free page made in 1995 that's still going strong.

Candidates for a Path

So I do think it's important to have a path, but it's not yet defined. A couple candidates spring to mind for me, but each one has one major drawback or another:

Hoist XPages Back to Jakarta EE

By this I mean taking XPages more-or-less as-is and running it on a normal JEE server. This certainly works, and access to the source would let me make it work better, but it kind of kicks the can down the road. XPages itself would still be moribund, and just running it on, say, a Domino-connected Open Liberty runtime wouldn't magically make it modern.

This would, though, provide some breathing room to manipulate an app in a better environment and transform it gradually. A JEE app can use a number of technologies that an XPages app can't currently, and so this would be a way to migrate the code without ever having a hard cutoff.

Bring the XSP Components to JSF 2.x

This is essentially the "upgrade JSF" request that has followed XPages since just after its birth, but with the slightly-lower goal of leaving XPages-the-stack where it is but making a copy of the components and infrastructure that can be brought into a normal JSF runtime like any other set of components. This would possibly be the hairiest of all options, since it wouldn't really be worth it unless things work near-100%, and there are so many little edge cases that it's harrowing to think about. Take the _xsp* methods grafted onto XPages's javax.faces.component.UIComponent alone, or whatever weird ways the XPages Ajax model differs from JSF's.

Still, it'd be doable if desired, and it'd provide a reasonable path to progressively "melt" the XSP components down until they're very-thin wrappers over normal JSF stuff, until they don't really diverge at all. With infinite resources, this would probably be the nicest route.

Try Transforming XSP Markup

By this I mean two similar possibilities: making an XSP-to-Java "compiler" that emits stock JSF components, or one-pass transforming XSP XML into JSF-compatible XML, though I think only the latter would be worth pursuing. While this could potentially rival the complexity of the JSF "update" route, I think that this would allow more room for things to break. For example, if you made an XML transformer, it could target a subset of controls but emit standard comments with TODOs to cover the parts it doesn't handle. That wouldn't be perfect, but you'd end up with real-world-compatible code without an intermediary translation layer, which is essentially what keeping the old components around would be.

This would probably have to involve porting over the SSJS EL extension and thus retaining support for various uncomfortable XPages- and Domino-isms, but them's the breaks, I suppose.
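To make the transformation idea a bit more concrete, here's a trivial, hand-waved illustration - the control choices are just examples, not output from any real tool:

```xml
<!-- XSP source -->
<xp:inputText id="subject" value="#{document1.Subject}" />

<!-- A plausible transformed Facelets equivalent -->
<h:inputText id="subject" value="#{document1.Subject}" />

<!-- For a control with no clean stock-JSF equivalent, emit a marker instead -->
<!-- TODO: convert xe:djTabContainer "mainTabs" to a JSF-compatible component -->
```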

Focus on REST APIs Only

This is the route where we would basically wash our hands of traditional XPages applications (minus bug fixes) and instead target writing only REST services, whether it be in the NSF, in plugins, or in normal JEE apps. This has the advantage of being easiest for HCL, since it more or less works (though the ExtLib's use of Wink holds back the JAX-RS version significantly), but it would still mostly be a "rewrite all your applications" route. The only salvaged code would be anything that's already cleanly separated in Java or SSJS, and I suspect that that's not the bulk of it.

Progressing Without a Defined Path

In the meantime, aside from personally becoming acquainted with other technology, I think it behooves all of us with actively-maintained XPages apps to step up the progress on making them portable. I've been doing this heavily for one of my client projects, and I'll have more specifics to say about that in the future. Some parts are straightforward and have remained good advice for a long time: don't use SSJS, do adopt Jakarta EE technologies, and adopt automated builds (including for your non-XPages NSFs).

The specifics get a lot more complicated, unfortunately. Since we've been swimming in the same XPages pond for a decade, even mostly-clean Java code is likely to be infested with XPages-isms, both out of habit and out of necessity. For example, hooking up file uploads to a Java bean requires using an XPages-specific class, which barely transfers to OSGi-based servlets, let alone any other environment. And there are tons of little things, like using XSPContext to get URL parameters. It's going to be a messy process, but I think it will be necessary for any apps that you plan to keep using for more than a year or so.
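To pick on the URL-parameter case, the fix itself is a one-liner - the trouble is that it's scattered all over a typical codebase. The "id" parameter here is just for illustration:

```java
import javax.faces.context.FacesContext;

import com.ibm.xsp.designer.context.XSPContext;

public class UrlParamExamples {
    // The XPages-specific way, which ties the code to the XSP runtime
    public String getIdTheXPagesWay() {
        FacesContext facesContext = FacesContext.getCurrentInstance();
        return XSPContext.getXSPContext(facesContext).getUrlParameter("id");
    }

    // The portable JSF way, which works in any JSF environment, XPages included
    public String getIdThePortableWay() {
        FacesContext facesContext = FacesContext.getCurrentInstance();
        return facesContext.getExternalContext().getRequestParameterMap().get("id");
    }
}
```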

I'll probably end up turning this into a series, where I'll discuss the various hurdles I've overcome in taking a complicated XPages app and gradually laying the groundwork for a different UI technology.

And, in the meantime, if you're working with large, active XPages applications, hit me up on Twitter and let me know how you've gone about building them. I realized earlier that, while I certainly have detailed knowledge of how people could write XPages applications, I don't have a good bead on how many actual XPages apps of each stripe exist and what the prevailing methods still are.