Better - But Fiddlier - Conversion of DateTime Objects

Wed Sep 03 10:20:22 EDT 2025

Tags: java jnosql

Among the many classes in the lotus.domino API, DateTime has always been particularly cursed. Not quite as cursed as its brother DateRange, which has never been - and still is not - readable from views and is only barely reconstructable in documents, but still rather cursed.

The Curse

For one, because it shares the trait with most other objects of having a backing C++ object that needs to be explicitly freed, it's important to make sure they're recycled regularly. But they immediately throw you a curveball in that they're children of Session and not the Document or ViewEntry they're pulled from, making it highly likely that you're going to leave them floating in memory for longer than expected when you do a loop. Moreover, they're likely to show up in viewEntry.getColumnValues() even if you don't care, so you have to make sure to studiously call, say, viewEntry.recycle(columnValues) in your loop too - and how many of us remember to do that? And all that hassle just for 64 bits of data total.

To make matters worse, DateTime had the bad luck of being paired with the famously bad java.util.Date class, which is itself a weird wrapper around essentially just an Epoch timestamp and riddled with early-Java-era bad decisions. It's also a very poor representation for a Notes time value, since it can only represent a full date+time pair and has no concept of a time zone or offset. The remaining Notes-isms are left to methods like getDateOnly() and getTimeOnly() that return Strings. While an American may not worry about what "A string representation of the date part of the time-date" implies, that vagueness should send a shiver down the spine of anyone more worldly.

Better Ways

The thing is, though, that the Notes time format actually matches up pretty closely with three of the java.time classes added in Java 8: LocalDate, LocalTime, and OffsetDateTime. Though these classes are now over a decade old, IBM and now HCL have not added support for them in the lotus.domino API. JNX does, though, and the results are much nicer than dealing with DateTime. The trouble is that JNX can do this by way of accessing the two 32-bit integers that make up the "Innards" property of the TIMEDATE struct, which isn't accessible on DateTime.

Well, not properly accessible, anyway. I got a wild hair to improve the conversion in the NoSQL driver yesterday, so I started digging. Immediately, I noticed that the lotus.domino.local.DateTime concrete class - the one used when not using CORBA or whatever "cso" references - has two int instance members named "mInnards0" and "mInnards1". Well, that looked promising! My first attempt was to just get those values using reflection, but they were still set to 0 - that makes sense, since all the actual methods in that class go to native code that presumably accesses the C-side value.

Those properties are referenced in some methods to do with "restoring" the object, which is not really a concept in the API in general, but I figured it could be like how there seems to be some auto-GC code floating around in Notes.jar that doesn't really come into play. And, by gum, that's exactly what it is: for some reason, these values are written when you call recycle() on the object. I don't know if this restoration mechanism is actually used or useful, but I don't really care; those two innards values are everything I need.

Putting It To Work

So I snagged the code from JNX that handles this and dropped it into the NoSQL driver, and now I can have a path that handles lotus.domino.local.DateTime objects specially, recycling them and then reading the now-set innards values to convert to near-perfect java.time representations.
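In code, the trick looks something like the following - a minimal sketch of the technique, not the driver's actual implementation, with the innards-to-java.time conversion (handled by code adapted from JNX) left out:

import java.lang.reflect.Field;

import lotus.domino.DateTime;
import lotus.domino.NotesException;

public class DateTimeInnards {
	// Relies on the side effect described above: lotus.domino.local.DateTime
	// writes its TIMEDATE innards to instance fields during recycle()
	public static int[] readInnards(DateTime dt) throws NotesException, ReflectiveOperationException {
		Class<?> localClass = Class.forName("lotus.domino.local.DateTime");
		if (!localClass.isInstance(dt)) {
			throw new IllegalArgumentException("Only the local (non-CORBA) implementation has these fields");
		}
		dt.recycle(); // side effect: populates mInnards0 and mInnards1

		Field innards0 = localClass.getDeclaredField("mInnards0");
		Field innards1 = localClass.getDeclaredField("mInnards1");
		innards0.setAccessible(true);
		innards1.setAccessible(true);
		// These two ints are the TIMEDATE innards to hand to the java.time conversion
		return new int[] { innards0.getInt(dt), innards1.getInt(dt) };
	}
}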

For the driver, this has some big benefits. Under profiling, when fetching 1000 view entries with a couple time values, calls to toTemporal previously took 560 ms (about 20% of the total time spent); now they take 64 ms (about 2%). Moreover, not only have I eliminated one of the bigger performance sinks, the result is better: now you can get an OffsetDateTime that actually represents the offset of the stored time. While 3 AM Eastern Daylight Time is technically the same moment as 7 AM UTC, the meaning of a document being created at 3 AM local time compared to 7 AM is potentially very significant.

I'm pleased as punch about this. Normally, relying on a weird side effect and reflectively accessing non-public implementation fields is not a good idea, but I think it's fair to assume that this behavior is not going to change any time soon. I'll still likely convert the NoSQL driver to use JNX eventually, but this honestly really lowered the priority: I'm able to get to most other things I want in a fast-enough way, and these date/time values were the main thing where it was outright deficient.

CDI Events in Action: DQL Explains in the JNoSQL Driver

Fri Aug 29 14:40:12 EDT 2025

Whenever I want to explain what CDI is to somebody who is unfamiliar with it, I start with "managed beans but better". It definitely is that - just using annotations to define scopes and @Inject to reference beans in code makes them much better than the ancient mechanism in XPages.

However, CDI definitely goes far beyond that, and I had use of one of its capabilities today: the event publish/subscribe system.

For a good while now, I've had a task to add DQL "explain" logging to the NoSQL driver in the XPages JEE project. This would be useful in troubleshooting the performance of the queries generated by NoSQL's mapping of method names like "findByFirstName" to the actual calls. The explain results could be used to determine where adding index views could speed things up.

The trouble with this was that it's never been clear how it should be logged. I could just use a normal java.util.Logger instance, but that would make the developer have to check the XML-based logs on the filesystem, which would be annoying for several reasons. I also didn't want to restrict it to just being something to go to a log, so I decided to instead use CDI events to publish the data when configured to do so. The mechanism to do this is in the project README, but I figure it's also a good example of using CDI's extended capabilities.

Publishing Events

Events themselves are arbitrary Java objects of any type and are published using the confusingly-named jakarta.enterprise.event.Event interface. For this purpose, we'll use a Java record as our event type:

public record MyEventClass(String payload) {
}

In a normal bean, you can get a handle on these emitters by declaring @Inject Event<MyEventClass> emitter in one of your beans. If your code is outside an existing bean (as is the case in my driver), you can call CDI.current().select(new TypeLiteral<Event<MyEventClass>>() {}).get() to get the same thing.
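Put together, the two acquisition styles look like this (a small sketch using the MyEventClass record from above):

import jakarta.enterprise.event.Event;
import jakarta.enterprise.inject.spi.CDI;
import jakarta.enterprise.util.TypeLiteral;
import jakarta.inject.Inject;

public class EventEmitterExamples {
	// Inside a managed bean, injection is all you need
	@Inject
	private Event<MyEventClass> emitter;

	// Outside of a bean, the same emitter can be looked up programmatically
	public static Event<MyEventClass> lookUpEmitter() {
		return CDI.current().select(new TypeLiteral<Event<MyEventClass>>() {}).get();
	}
}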

Once you have this, you can emit events by passing event objects to it:

emitter.fire(new MyEventClass("this is my payload"));

Receiving Events

Once you have code that emits events, the other side is to listen for them. Any bean with a method whose parameter is annotated with jakarta.enterprise.event.Observes will receive matching events. For example:

@ApplicationScoped
public class ListenerBean {
	public void eventHandler(@Observes MyEventClass event) {
		System.out.println("I received an event: " + event);
	}
}

And that's it - when an event of this type is fired anywhere, your bean will hear about it without needing any knowledge of the sender. In the case of the NoSQL driver, this means that developers can listen for ExplainEvents and log or otherwise process them as desired. That also means that my code doesn't have to make any assumptions about how the results are to be used.

As is often the case, a publish/subscribe system like this can be a very potent tool, and you can go a lot further with it, writing your applications to be much more event-based. You could go TOO far with it, of course, but judicious use can make your code cleaner, more explicit about what's going on, and more extensible for future needs.

Jakarta REST Tip: Custom Path Param Resolvers

Mon Jul 21 19:07:09 EDT 2025

When you're designing a REST API or MVC app of more than basic complexity, you're likely to end up with a lot of different endpoints with the same path components. For example, in the new OpenNTF home app, projects have a lot of sub-pages with URL paths like:

  • projects/{projectName}
  • projects/{projectName}/releases
  • projects/{projectName}/releases/{releaseId}
  • projects/{projectName}/screenshots
  • projects/{projectName}/screenshots/{screenshotId}

This will likely start with a class like this, using Jakarta Data to get your model objects:

package controller.projects;

import jakarta.inject.Inject;
import jakarta.mvc.Controller;
import jakarta.mvc.Models;
import jakarta.mvc.View;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import model.projects.Project;

@Path("projects")
@Controller
public class ProjectsController {
	@Inject
	private Project.Repository projectsRepository;

	@Inject
	private Models models;

	@Path("{projectName}")
	@GET
	@View("project/summary.jsp")
	public void showProject(@PathParam("projectName") String projectName) {
		String key = projectName.replace('+', ' ');
		Project project = projectsRepository.findByProjectName(key)
			.orElseThrow(() -> new NotFoundException("Could not find project for key " + key));
		models.put("project", project);
	}

	@Path("{projectName}/releases")
	@GET
	@View("project/releases.jsp")
	public void showProjectReleases(@PathParam("projectName") String projectName) {
		String key = projectName.replace('+', ' ');
		Project project = projectsRepository.findByProjectName(key)
			.orElseThrow(() -> new NotFoundException("Could not find project for key " + key));
		models.put("project", project);
	}

	// And so forth
}

That's fine and all, and you can scale it out as you add more "sub"-resources by having another controller with e.g. @Path("projects/{projectName}/releases"). But even in the basic case, there's a clear problem: I've duplicated the logic of looking up the project from the name parameter, and that duplication will spread to almost every method in this and related resources.

This can be alleviated a bit by moving @PathParam("projectName") String projectName to the class level and having a utility method to do the lookup, but that's not quite ideal. What we really want is a way to encapsulate the meaning of this translation in a way that can be used transparently any time we need it.

Path Param Converters

By default, path params are naturally strings, but the REST framework knows how to convert them to basic values - "1" to integer 1, "true" to boolean true, and the like. This is an extensible mechanism, though, and so we can teach our app to convert project names to Project model objects.

To do this, we'll make a new class that implements jakarta.ws.rs.ext.ParamConverter:

package rest.ext;

import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.ext.ParamConverter;
import model.projects.Project;

@ApplicationScoped
public class ProjectParamConverter implements ParamConverter<Project> {
	
	@Inject
	private Project.Repository projectRepository;

	@Override
	public Project fromString(String value) {
		String key = value.replace('+', ' ');
		return projectRepository.findByProjectName(key)
			.orElseThrow(() -> new NotFoundException("Unable to find project for name: " + key));
	}

	@Override
	public String toString(Project value) {
		return value.getName();
	}

}

We make this a CDI bean so that it can participate in the whole CDI environment and get our data repository. Here's where we put our logic, allowing us to centralize what it means to go from a string to a model object. We also provide a way to convert back, which the framework can use when it needs to build parameter values from objects.

Just creating this as a bean isn't quite enough - we still need to tell the REST runtime about it. To do that, we create a class that implements jakarta.ws.rs.ext.ParamConverterProvider:

package rest.ext;

import java.lang.annotation.Annotation;
import java.lang.reflect.Type;

import jakarta.inject.Inject;
import jakarta.ws.rs.ext.ParamConverter;
import jakarta.ws.rs.ext.ParamConverterProvider;
import jakarta.ws.rs.ext.Provider;
import model.projects.Project;

@Provider
public class ProjectParamConverterProvider implements ParamConverterProvider {
	
	@Inject
	private ProjectParamConverter projectParamConverter;

	@SuppressWarnings("unchecked")
	@Override
	public <T> ParamConverter<T> getConverter(Class<T> rawType, Type genericType, Annotation[] annotations) {
		if(Project.class.equals(rawType)) {
			return (ParamConverter<T>)projectParamConverter;
		}
		return null;
	}

}

With the @Provider annotation, REST will pick up on this and ask it for a converter whenever it encounters a type it doesn't inherently understand. For now, we cover just Project, but we can expand this as more types come into play (for example, project releases, which will also be in path parameters).

Putting It To Use

With this in place, we can go back to our ProjectsController class and clean it up:

package controller.projects;

import jakarta.inject.Inject;
import jakarta.mvc.Controller;
import jakarta.mvc.Models;
import jakarta.mvc.View;
import jakarta.ws.rs.GET;
import jakarta.ws.rs.NotFoundException;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import model.projects.Project;

@Path("projects")
@Controller
public class ProjectsController {
	@Inject
	private Models models;

	@PathParam("project")
	private Project project;

	@Path("{project}")
	@GET
	@View("project/summary.jsp")
	public void showProject() {
		models.put("project", project);
	}

	@Path("{project}/releases")
	@GET
	@View("project/releases.jsp")
	public void showProjectReleases() {
		models.put("project", project);
	}

	// And so forth
}

That's noticeably cleaner! It also means that our controller layer now has less to know about: if we change how projects are looked up (say, allowing UNIDs as well as names), this class doesn't need to change. The code is also much more explicit - we don't really care about the project name as such, but rather just that we have a URL that includes whatever the identifier for a project is. Someone entirely new to Jakarta REST may not know how that translation happens, but, once you know how it works, it's much clearer.

Figuring this out reinforced an important lesson for me: whenever I find that I'm building something out in a way that feels gangly, there's usually an idiomatic solution to the problem that improves code readability and maintainability. It's one of the advantages to using a mature and broad-scoped framework like this.

Quick Tip: Deploying Static Resources As A Webapp

Sun Jun 29 13:38:14 EDT 2025

Tags: domino osgi

The other day, I had occasion to deploy a newer Dojo version to Domino than what was present - a later version fixed some layout problems in a chart, but the client isn't ready to update to Domino 14.5. Since this wasn't an XPages app, I didn't need to worry about incompatibilities or the XPages-specific extensions, but there was still the matter of getting Dojo's giant file tree served up by Domino.

Normally, this isn't a big deal - a JavaScript library can usually be plunked into the NSF without issue. But Dojo's tree is big enough that it will break the DB's design. I could probably fix that by pruning the directory, but I didn't want to spend that much time on the task. It'd also have the disadvantage that Domino would complain about any "?" URLs that don't start with "?Open", which is a drag.

The next option is to copy the files to the data/domino/html directory. That would work, but it kind of sucks: now you have files not represented in NSFs, and one more thing for admins to keep track of and for developers to forget. So that's an option of last resort.

Another approach I've taken in the past is to make an OSGi plugin that contributes an ExtLibLoaderExtension. That lets your plugin contribute resources to the ExtLib's own resource provider. The big advantage here is that, if you write your code right, they can participate in the minifier. However, since this isn't an XPages app, that won't realistically do me any good, and it'd also involve writing some fiddly custom code that we'd have to know about forever.

Wrapper App

But then I came up with another route: I could make a tiny little web app using the com.ibm.pvc.webcontainer.application extension point solely to host these files. Normally, you'd use that point to make a complicated (ancient) Java EE web app, but this use doesn't even involve any Java code. I made mine in Maven, but even that is technically optional: the result just has to be a JAR (which is a ZIP file) with a couple files in it, plus whatever resources you want to serve. I'll use it here, though, since it takes care of some fiddly bits for us.

First up, you should have a "src/main/resources/plugin.xml" file with contents like:

<?xml version="1.0" encoding="UTF-8"?>
<?eclipse version="3.4"?>
<plugin>
	<extension point="com.ibm.pvc.webcontainer.application">
		<contextRoot>/dojo-1.17.3</contextRoot>
		<contentLocation>WebContent</contentLocation>
	</extension>
</plugin>

Here, "/dojo-1.17.3" is the path on the web server it'll be accessible as, while "WebContent" is where we'll be storing the files within our JAR.

Next up, we want a file named "src/main/resources/WebContent/WEB-INF/web.xml". This would normally list Servlets and other configuration for the web app, but we don't have any of that - it's just here as a stub:

<web-app xmlns="http://java.sun.com/xml/ns/javaee" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://java.sun.com/xml/ns/javaee http://java.sun.com/xml/ns/javaee/web-app_2_5.xsd"
	version="2.5">
</web-app>

Then, put any files you want to serve up in "src/main/resources/WebContent". We don't need to configure anything else to point at these, so they can be anything. Here, I put the "dojo", "dijit", and "dojox" directories there:

Screenshot of the project file layout

Finally, we'll have the "pom.xml" file at the root of our project to tell it how to build. This has a couple custom steps to the build, but mostly it could be copy-and-pasted into your own project, just changing the groupId, artifactId, and name to suit your needs.

<project xmlns="http://maven.apache.org/POM/4.0.0"
	xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
	xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
	<modelVersion>4.0.0</modelVersion>
	<groupId>com.example</groupId>
	<artifactId>com.example.dojo</artifactId>
	<version>1.17.3</version>
	<name>Dojo Assets Wrapper</name>
	<packaging>bundle</packaging>

	<properties>
		<maven.compiler.source>1.8</maven.compiler.source>
		<maven.compiler.target>1.8</maven.compiler.target>
		<maven.compiler.release>8</maven.compiler.release>
		<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
		<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
		<osgi.qualifier>${maven.build.timestamp}</osgi.qualifier>
	</properties>

	<repositories>
		<repository>
			<id>artifactory.openntf.org</id>
			<name>artifactory.openntf.org</name>
			<url>https://artifactory.openntf.org/openntf</url>
		</repository>
	</repositories>

	<build>
		<plugins>
			<plugin>
				<groupId>org.apache.felix</groupId>
				<artifactId>maven-bundle-plugin</artifactId>
				<version>5.1.2</version>
				<extensions>true</extensions>
				<configuration>
					<excludeDependencies>false</excludeDependencies>
					<instructions>
						<Bundle-SymbolicName>${project.artifactId};singleton:=true</Bundle-SymbolicName>
						<Automatic-Module-Name>${project.artifactId}</Automatic-Module-Name>
						<Export-Package />
						<Require-Bundle />
						<Import-Package />
						<Bundle-RequiredExecutionEnvironment>JavaSE-1.8</Bundle-RequiredExecutionEnvironment>
					</instructions>
				</configuration>
			</plugin>
			<plugin>
				<groupId>org.openntf.maven</groupId>
				<artifactId>p2-maven-plugin</artifactId>
				<version>2.2.0</version>
				<executions>
					<execution>
						<id>generate-site</id>
						<phase>install</phase>
						<goals>
							<goal>site</goal>
						</goals>
						<configuration>
							<featureDefinitions>
								<feature>
									<id>${project.artifactId}.feature</id>
									<version>${project.version}</version>
									<label>${project.name}</label>
									<providerName>${project.groupId}</providerName>
									<description>${project.name}</description>
									<artifacts>
										<artifact>
											<id>${project.groupId}:${project.artifactId}:${project.version}</id>
											<transitive>false</transitive>
										</artifact>
									</artifacts>
								</feature>
							</featureDefinitions>
						</configuration>
					</execution>
				</executions>
			</plugin>
			<plugin>
				<groupId>org.darwino</groupId>
				<artifactId>p2sitexml-maven-plugin</artifactId>
				<version>1.2.0</version>
				<executions>
					<execution>
						<phase>install</phase>
						<goals>
							<goal>generate-site-xml</goal>
						</goals>
						<configuration>
							<category>${project.name}</category>
						</configuration>
					</execution>
				</executions>
			</plugin>
		</plugins>
	</build>
</project>

Once you run mvn install for that, you'll end up with a repository in "target/repository" with a "site.xml" file suitable for importing into an update site NSF.

This need doesn't come up too frequently, since it's rare that I have to deploy this many files at once, but I'm pleased with this technique. That update site can be replicated around, and the presence of it in the notes.ini means that it's much more self-documenting than dropping the files on the filesystem. Plus, the fact that it involves no custom executable code or weird Domino dependencies makes it very portable to build anywhere. It's something for me to keep in my back pocket for the future.

Developing Jakarta Apps Via NSF File Server

Wed Jun 25 16:42:37 EDT 2025

In my previous post, I made an offhanded comment towards the end about how the tool was really for static files and, while it'd be technically possible to deploy Java code this way, it'd be awkward and weird.

But that got me thinking: just how awkward and weird are we talking here? So I set out to trying to see. An app that is all-in on Jakarta EE is basically a normal old webapp housed in an NSF, and the structure of the "WebContent" directory is the same as a WAR: visible files go in the top level, JARs go in "WEB-INF/lib", and compiled classes go in "WEB-INF/classes". Normally, the Code/Java design notes in an NSF are those compiled class files, doing double duty to hold the source and the bytecode in the same note, but that's not the only way. If you have a file resource that calls itself "WEB-INF/classes/foo/Bar.class", it'll count as a class in the app. This is how the pre-8.5.3 method of adding Java classes worked: Designer would make one file resource for the source file and another for the bytecode.

Trying It Out

So, since a running app doesn't care about the source, in theory the presence of just the .class files in the right place should suffice for deploying an app. I decided to put the theory to the test and, lo and behold, it worked:

Screenshot of Java bytecode being uploaded via scp to an NSF and a REST service working

Those are the classes from the Jakarta EE starter project, which gives you a Maven project with a JAX-RS resource that emits JSON. The XPages JEE project creates an environment pretty comparable to a normal Jakarta EE server, so the compiled classes work just the same when copied around.
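That resource boils down to roughly this shape (my paraphrase of the starter's resource, not its exact file):

import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

// A bare JAX-RS endpoint that emits JSON once its compiled .class files
// are copied into WEB-INF/classes in the NSF
@Path("hello")
public class HelloResource {
	public record Hello(String message) {
	}

	@GET
	@Produces(MediaType.APPLICATION_JSON)
	public Hello hello() {
		return new Hello("Hello from an NSF-deployed class file");
	}
}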

So far, so good, but what about a more-complicated example? A real app will have non-class Java resources (translation properties files, for example), web resource files (HTML, CSS, JS), and dependency JARs. So I added commonmark as a dependency to my project via Maven and built the project. I found that, with a Maven project, the entire structure of your resulting WAR is present in "target/your-artifact-id", and thus I switched to copying that up:

Screenshot of a more-complicated app in Eclipse uploaded to and running in Domino

This actually scratches an itch I've had for a long time: I've considered the notion of having a mode in the NSF ODP Tooling project that takes a WAR and makes an NSF out of it in this format. As it turns out, the result of that would be the same as this, and this allows for faster iteration on testing changes.

Some Caveats

One major limitation not so much of the tools but of the underlying protocol is that "recursive delete" isn't inherently a thing in SFTP, but it's necessary to make sure that any files from an old build not in new builds (renamed classes, etc.) are deleted when deploying a new version. Since rsync would take a lot of work, I've been looking at lftp, which may fill the gap, but it'll take some fiddling. Dedicated file-transfer GUI tools like Transmit could help too.

Additionally, while it's possible to create a compatible classpath in a Maven project by including the core JEE APIs and some Domino-specific things like the NoSQL driver as dependencies, this route wouldn't help you for things like creating or modifying views. Those don't change as frequently as app code, though, so it'd probably be fine to have Designer (or Nomad Designer) around for that task in much the same way as one might use a database management app in other environments.

It's also a longer cycle between making a change and seeing it than when using Designer, since you'd have to make sure to do a local build and then copy the files up, which will scale in time with the complexity of your project. That may be a fair price to pay to have your choice of IDEs and the ability to use Maven for dependencies and processing, though.

Anyway, this is all pretty neat. I've really gotten into a good groove of app dev with the JEE project lately, but I won't pretend I wouldn't mind a nicer development environment. I think I'll have to try using this for some of my projects for a bit to find out if it'll work in general.

Integrating VS Code (And Others) With Domino Via NSF File Server

Wed Jun 25 13:46:23 EDT 2025

Earlier today, Richard Moy posted a neat blog post on a mechanism his team developed to be able to work on HTML, CSS, and JS in Visual Studio Code and sync the results to an NSF, allowing them to use more-modern tools while still hosting in an NSF.

That got me thinking about the NSF File Server project and its support for storing files in "WebContent" in an NSF, and how I could write about how it could accomplish a similar thing. After testing it and finding a bug to fix with served MIME types, I uploaded 2.1.0 and now I'm ready to talk about it!

Quick Intro

The current form of the project came out a bit over a year ago, and I posted about the various new features. The important one for our needs is the ability to use the conceptual "WebContent" directory in the NSF to house static files. As a note, though this directory usually comes up in the context of XPages/Java development, it's not really tied to them - it's just another type of File Resource.

The way the project works is that it's an OSGi plugin and thus best deployed in an Update Site NSF referenced via notes.ini. There's also a config NSF to create - the NTF is in the distribution as "fileserverconfig.ntf" and should be created on the server as "fileserverconfig.nsf". This DB can be replicated around your domain, since all of the configuration options are or can be server-specific.

Once the NSF is in place, create a document in the "Server Configurations" view for your server to set the port it will listen on. Then, create a document in "Mounts" to point to the NSF you want to work with, giving it an appropriate name and putting the NSF path in "Data Source Path":

Screenshot of the Mount configuration document UI

You'll probably also want to tweak your Domino Directory NSF to add a text item named "sshPublicKey" to house, well, each user's SSH public key (the kind you get from ssh-keygen). In my environment, I added this (and a bunch of other stuff) to the "$PersonExtensibleSchema" subform as a multi-value text item:

Screenshot of an sshPublicKey item in names.nsf

You can also use password-based authentication, but you shouldn't.

Once it's all set, restart HTTP, since the server's lifetime is bound to HTTP.

Testing It Out

Before trying any fancier integration, it'll be a good idea to test using a simple SFTP client. The simplest is the sftp tool that comes with the SSH suite on basically every OS:

sftp -v -i ~/.ssh/id_rsa -P 9022 "CN=Jesse Gallagher/O=IKSG@your.domino.server"

The username should be something recognizable to Domino - I'm sure variations will work, but I like using the full DN by default to eliminate ambiguity.

Once connected, you should see your mounted DB as a folder. You should be able to cd into it and ls to list the contents. In a new NSF, you'll see the WEB-INF directory Designer creates, and you'll also see anything you put in there intentionally. If you have a local file handy, you can put it to try uploading and deleting it:

sftp> cd appexample
sftp> put license.txt
Uploading license.txt to /appexample/license.txt
license.txt                                                                            100%  589   184.5KB/s   00:00
sftp> rm license.txt
Removing /appexample/license.txt
sftp> 

If all is set up properly and you have Designer access to the NSF, you should be good to go.

Configuring VS Code

In my tinkering, I used the simply-named SFTP plugin for VS Code, which lets you configure a directory to sync automatically with a remote SFTP server. Following the examples, I configured my .vscode/sftp.json config file like:

{
    "name": "Your Server",
    "host": "your.domino.server",
    "protocol": "sftp",
    "port": 9022,
    "secure": true,
    "username": "CN=Jesse Gallagher/O=IKSG",
    "privateKeyPath": "/Users/jesse/.ssh/id_rsa",
    "remotePath": "/appexample",
    "uploadOnSave": true,
    "ignore": [".vscode", ".git", ".DS_Store"]
}

Once I did that (and re-opened the dir in VS Code to be safe, though I don't know if that was needed), things started syncing on save:

Screenshot of a file edited in VS Code visible in Designer

In that screenshot, you can see the files in WebContent using Designer's Project Explorer view, but that's entirely optional - at this point, you're free to uninstall Designer if you so desire, since VS Code will get the files in there itself thanks to the SFTP server. Your files will show up at URLs like https://your.domino.server/yourapp.nsf/baz.html.

Other Uses

Though I focused on VS Code as an example, there's nothing special about it for using the SFTP server. Since it's a generic protocol, you could also use any other IDE with similar support, or any other SFTP client that can do the syncing for you (rsync doesn't work currently, since it requires its own protocol tunneled over SSH). For example, you might have a CI/CD server like Jenkins do the SFTP for you at the end of a build, so you can see files automatically deployed to Domino. You can also use whatever the heck OS you want as long as it has an SFTP client.

What this won't do (right now) is any kind of non-static programmatic elements like forms, views, agents, or Java elements. I guess you could in theory compile Java locally and push the results to WEB-INF/classes and WEB-INF/lib, which should work, but that would be an odd way to develop. This is primarily of use for writing or deploying static files, either manually edited or as built by a (freaking) JS transpiler toolchain.

It'd also be possible to do much the same thing with other storage "backends". The "NSF File Store" built-in type, for example, uses data documents with the form and views in the "filestore.ntf" template, storing the file data as attachments. One could use that template directly or make compatible views in another app and use that for deployment without requiring giving anyone Designer access to an NSF. It'd also be possible to write an entirely-different storage mechanism, but that'd require the fiddly task of implementing the VFS in Java, so it'd be for dedicated developers only. I think the WebContent mode would do the job nicely in most cases regardless.

In any event, give it a shot! I think it's a neat tool and likely has a potential place in a lot of workflows.

Notes/Domino 14.5 Fallout

Tue Jun 17 09:06:49 EDT 2025

  1. Oct 19 2018 - AbstractCompiledPage, Missing Plugins, and MANIFEST.MF in FP10 and V10
  2. Jan 07 2020 - Domino 11's Java Switch Fallout
  3. Jan 29 2021 - fontconfig, Java, and Domino 11
  4. Nov 17 2022 - Notes/Domino 12.0.2 Fallout
  5. Dec 15 2023 - Notes/Domino 14 Fallout
  6. Sep 12 2024 - PSA: ndext JARs on Designer 14 FP1 and FP2
  7. Dec 16 2024 - PSA: XPages Breaking Changes in 14.0 FP3
  8. Jun 17 2025 - Notes/Domino 14.5 Fallout

With Domino 14.5 out, it's time for me to do my now-traditional post about the release, covering the pertinent changes and bugs I've hit in the release version.

The good news with this release from a Java perspective is that it's pretty smooth sailing. Since Domino 14 already took the big bite of moving past Java 8, the move from Java 17 in 14 to 21 in 14.5 is not really disruptive at all. Hopefully, this trend will keep up - moving from one LTS release of Java to the next is much smoother than jumping over one, and admittedly moving from 8 to anything was more disruptive than any subsequent version bump. My hope is that Domino will remain more-or-less on top of current Java LTSes. The next one will be Java 25 in September, so Domino is current (LTS-wise) at the moment.

Most of the Java features we care about are of the quality-of-life type, which is great. Java has been on a real "let's make developers happy" kick lately, and I'm all for it.

Eclipse also got a bump from 2021-12 to 2023-12, which, while a year and a half old now, is nice. I'd like to see this track closer to current Eclipse, but this is at least much, much better than the bad old days of the 9.x era. I've had some luck installing some recent-ish upstream Eclipse plug-ins that I may make into an OpenNTF project, though some of the ones I want to use require a version newer than 2023-12.

Really, all the notes from my post about Domino 14 still apply here. There aren't any major new hurdles or workarounds to be aware of, so I can skip the main topic and focus just on the little things to know.

Domino

Domino got a bunch of new features, though most are outside my bailiwick. There are a few niceties that show up in my regular work that I'd like to mention, though.

Adminp Signing With One-Touch Setup

When you use One-Touch Setup for a Domino server, you can tell it to sign any DBs you create with it with adminp. I use this all the time for my test suites, since XPages and Java code needs to be properly signed to run. In previous versions, this would appear as tons of lines like "Adding sign bit to item $LargeSummary" on the console. Since my test suite has to wait for this to finish to run, I had to have the code scan for some number of these - while presumably predictable, I found it most practical to just pick a number that worked and go with it.

In 14.5, it will now emit "Database signed by Adminp: foo.nsf", which is much cleaner and also much more useful, since now I can check for the specific count of NSFs that it's signing. This is a huge relief.

Wink Chattiness is Fixed

Speaking of unnecessary console output, one of the things that has bugged me in recent versions is that JAX-RS resources registered with Wink spew a lot of INFO-level junk onto the console with no way to tell it otherwise. This was actually always the case with the Wink version in Domino, but it became more annoying once Verse (which still uses that old JAX-RS version) started coming pre-installed, leading to these messages as soon as the first Java request came in after every HTTP start. In 14.5, this is fixed. Whoo!

Designer

Most of the enhancements in Designer come from improvements to Java and the underlying Eclipse editors. One nicety - and I forget how much this was the case in 14 - is that Designer is making use of the Language-Server-based editors for JS and CSS more and more with each version, much to our benefit. The age of the Eclipse base means we don't get everything in newer versions (for example, the CSS editor doesn't know about CSS Nesting), but it's still a step up.

Modern JavaScript Syntax

Designer has adopted these LSP-based editors in most places where you edit client JavaScript. This is a great improvement, as the editor is much better and recognizes modern JavaScript syntax. However, while the editor works with it, editing a JavaScript Script Library design element won't let you save legal modern syntax like let foo = (bar) => {}. If you try that, you'll get an error dialog (three times) saying there's a syntax error.

The workaround I've found for this is to write my JavaScript libraries as File Resources instead, which use the same newer editor but don't have the on-save old validation. These files presumably won't show up in GUI places that list script libraries, but that's a small price to pay.

ndext Annoyance

Unfortunately, the post about Designer's "ndext" behavior in the FPs still applies. Designer will still repeatedly re-add all the crud from the "ndext" directory to the main JRE on every launch, including the traitorous "jsdk.jar", so be on the lookout for that.

It seems like you can work around this by making a second JRE entry in the "Installed JREs" prefs, cleaning it up, and making it your default. Designer shouldn't mess with that one on launch.

Another way you could work around that is to move to the XPages JEE platform, since the current Servlet classes in that project don't conflict with either of the primordial ones in Domino.

Project Explorer Workaround

My preferred project view in Eclipse is the "Project Explorer" view, which is like the older "Package Explorer" in most ways but generally a bit nicer. For example, you can have it show Working Sets as top-level elements with twisties, which is very handy when you're working with a lot of projects.

Designer has this view too and it works the same way, but there's a small bug: when you first launch Designer, this view will be empty even if you have projects in your workspace. You can make the contents appear by clicking the little filter icon and then hitting OK without changing anything.


All in all, 14.5 is a very solid release. I've been using the EA versions since the start for normal day-to-day development and plan to continue to do so. The Java updates alone mean that this is all the more reason to leave 12.x and below in the dust.

New Release: XPages Jakarta EE 3.4.0

Sat Jun 07 14:57:04 EDT 2025

Tags: jakartaee

Today, I published version 3.4.0 of the XPages JEE project. In addition to the usual spate of bug fixes and refinements, this release brings two significant features: a bump to Jakarta EE 11 (mostly) and a new type of pure Jakarta app within an NSF.

Jakarta EE 11

Jakarta EE 11 has been in the works for a good while but is now mostly released. A few components are still slotting into place, and some of the implementations are still in beta form, but they're in a reasonable state to put into a release. I like to keep this project up to date, and there are some nice improvements in here.

  • It officially adds Jakarta Data as a spec, though we've had that in the project for a good while now
  • CDI has a bunch of cleanup changes and now lets you organize producers by @Priority, which can be handy in some cases
  • Faces also has a handful of improvements, cleaning up the API a bit and making events more consistent
  • Pages (which remains my HTML generation tool of choice in new apps) got some long-needed cleanup
  • REST gained some new methods for working with headers and processing matched resources
  • Expression Language gained proper support for records and Optional (though the latter is off by default in the spec, it's enabled in this project)
  • Servlet gained some methods to work with Charset and ByteBuffer objects, which I've shimmed into the older ones Domino internally supports
  • MVC made a few good but breaking changes, removing support for Facelets (Jakarta Faces pages) as a view type and making CSRF enabled by default
  • Concurrency has some nice new features, though unfortunately the currently-released implementation doesn't yet support the nifty @Asynchronous annotation. That should come in the future, though

To go with these improvements, I put in a handful of refinements:

  • There's a new @DocumentConfig annotation you can use with NoSQL @Entity-annotated classes to declare what the actual form name should be. This is useful for cases where you have entities in the same app representing documents in different databases that happen to use the same form
  • There's a new org.openntf.xsp.jakarta.el.ext.ELResolverProvider ServiceLoader interface to customize handling of Expression Language resolution per-app
  • MicroProfile Metrics is gone, as it was removed upstream. In a future version, I plan to implement MP Telemetry, which is more flexible and has integrations with more services

Jakarta Modules

The largest feature addition, though, is what I've deemed "Jakarta Modules". The "module" term here is in reference to ComponentModule, the system Domino's HTTP stack uses to manage individual applications at the Java level. I've talked about this before, and the idea here is that a Jakarta Module is a peer to NSFComponentModule (the way NSFs are normally run with XPages) and OSGi*Module (the way Equinox Servlets and OSGI WebContainer apps are run).

These app types still use an NSF and you develop them the same way using Designer, but there are some key differences:

  • The entire application is treated like a "normal" Jakarta web app, which means that it has fewer old-to-new shims and workarounds
  • These apps are configured with a new database - jakartaconfig.nsf - that allows you to specify a web path to reference an NSF. So, for example, you configure "/someapp" to read its data from "/apps/someapp.nsf", and then all URL references will be "/someapp/your/app/url". This means that the URLs don't include ".nsf" and don't require having "xsp" somewhere in there like they do with traditional apps. This also means you can configure REST to use the app root as its own root, which is particularly useful when using MVC
  • XPages and legacy design elements like forms and views are not supported for web access. While you can still freely use views programmatically, they aren't accessible by URL within the module's space. Web-friendly elements like file resources, style sheets, and images work as they do with traditional apps
  • These apps are initialized at HTTP start, which means you can reliably use things like ServletContextListener and CDI beans with methods like void init(@Observes Startup) to do app pre-load. CDI beans like that will have a NoSQL context running as the DB signer, so you can use those to load configuration data from the NSF, for example
  • The ability to hook into HTTP init also means you can use ManagedScheduledExecutorService for a predictable replacement for scheduled agents that run entirely in your app space (see the sketch after this list)
  • This also allows reliable use of HttpSessionListeners and ServletRequestListeners, which don't quite have proper hooks in XPages apps
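
As a minimal sketch of the startup and scheduling hooks above (the bean and task names are hypothetical; the wiring itself is standard CDI and Jakarta Concurrency):

import java.util.concurrent.TimeUnit;

import jakarta.annotation.Resource;
import jakarta.enterprise.concurrent.ManagedScheduledExecutorService;
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.enterprise.event.Observes;
import jakarta.enterprise.event.Startup;

@ApplicationScoped
public class AppStartupBean {
	@Resource
	private ManagedScheduledExecutorService scheduler;

	// Observes the CDI Startup event, which fires when the module is
	// initialized at HTTP start
	public void init(@Observes Startup startup) {
		// Hypothetical periodic task standing in for a scheduled agent
		scheduler.scheduleAtFixedRate(this::checkForUpdates, 0, 30, TimeUnit.MINUTES);
	}

	private void checkForUpdates() {
		// Periodic work that would traditionally live in a scheduled agent
	}
}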

It's possible to develop an application that will work in both "modes", and that's how I've written the test cases for this: I made it so that the non-XPages tests for most features will now access the NSF both in the traditional way and in this new way. Doing this means avoiding XPages-specific classes like ExtLibUtil and NotesContext, and I have some documentation that covers a few of the aspects for writing "portable" apps. In general, though, I imagine it'll be the case that you'll want to pick one type and go with it, since there's no particular reason to access the app both ways.

Since this feature is a complex undertaking, it's more likely than average to contain bugs, so I'd appreciate as much testing as possible. I plan to move a couple apps over to this format - I decided to bite the bullet for Magpie's sake, since I wanted scheduled tasks, but I also plan to dogfood it with other things I work on. I also expect I'll have some followup blog posts about it, as I had to solve a number of interesting technical problems to get it to this point.

Future Enhancements

With this one in the bag, I have some plans for future versions. For one, I've been lining up desired features for Jakarta Modules, but beyond that I'll want to bump the remaining JEE 11 laggards when their implementations are out, and there's the aforementioned MicroProfile Telemetry. I'd also at some point like to give the same Jakarta Module treatment to more-traditional webapps, potentially attaching a WAR file in jakartaconfig.nsf and using that. I imagine most of the code would be shared between the two - a lot of the NSF-specific code is encapsulated in a few classes, while all the app-lifecycle bits are independent of the actual storage of classes and resources. That'd be a fun one to do.

Magpie

Sun Apr 20 15:17:11 EDT 2025

The app icon for Magpie, designed by my wife

A few weeks back, I started up a new spare-time side project: Magpie. It's an app to provide a UI for downloading games you've purchased from GOG (and maybe other sites in the future) to keep them locally in Domino.

The goal is to have a self-hosted way to keep access to games you've purchased without having to rely on the future availability of them on someone else's servers. Domino's a perfect fit for that sort of thing, and this is an itch I've wanted to scratch for a while.

GOG

I've always liked GOG. Their original "Good Old Games" mission of dusting off old games to get them easy to run - both legally and technically - on modern systems has been a great boon to game preservation, and it's been easy to build up a big library of games. They've also always had a policy of making the games you buy DRM-free - even modern releases - and they offer offline installers in addition to their Steam-style GOG Galaxy app. It's this part that allows Magpie to be practical.

While I like GOG and fully expect them to keep doing what they're doing, it's become quite apparent especially lately that you can't rely on any company acting in ways you like indefinitely. Even if GOG itself stays "good", they can't guarantee anything for the future. For example, they used to sell Warcraft 1 and 2, but, when Blizzard released their remasters, they quickly had GOG remove their version from sale. If you purchased the GOG versions before then, you can still download them, but I wouldn't be surprised if that changes down the line. It's within Blizzard's rights to do this, but that's sort of the point: I don't want to rely solely on the continued good graces of a vendor to let me access a thing I bought.

Fortunately, GOG has an API... sort of. It's clearly designed just for their own app to use and you have to piggyback on a single client ID for tokens, but it ends up working, at least for now. It's not the sort of thing you'd build a commercial application on, but a for-fun project can be fine with that level of risk. With a token in hand, the API works well to list and access your owned games, and the download URLs will redirect over nicely to the offline installers and ZIPs of extra materials.

App Structure

Since this is a from-scratch clean project, I'm free to use the XPages JEE project exclusively and target Domino 14.5 with Java 21, and it makes it a joy to program. The remote API access can be clean MP Rest Client interfaces, I can map the JSON to record types, and I can use Concurrency executors to download files asynchronously.
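
As a taste of that, a remote-API client can be as small as an annotated interface (a sketch with hypothetical endpoint and field names, not GOG's actual API shape):

import java.util.List;

import jakarta.ws.rs.GET;
import jakarta.ws.rs.HeaderParam;
import jakarta.ws.rs.Path;

import org.eclipse.microprofile.rest.client.inject.RegisterRestClient;

// Hypothetical client shape: MicroProfile Rest Client generates the
// implementation, and JSON responses map straight to records
@RegisterRestClient(baseUri = "https://api.example.com")
public interface GameLibraryClient {
	record OwnedGames(List<Long> owned) {
	}

	@GET
	@Path("/user/games")
	OwnedGames getOwnedGames(@HeaderParam("Authorization") String bearerToken);
}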

Writing an app like this is also a big boon to the upstream project. There's no better way to test a framework than to actually do stuff with it, and this has helped me improve the Concurrency experience and find a couple bugs that hadn't cropped up in abstract tests.

The downloaded files end up getting stored as attachments in the NSF, which is... fine for now. It's no problem at all for a 13MB game like Ultima 1 and remains pretty reasonable for a 1GB game like Stardew Valley, but I'd start feeling pretty weird about putting a 300GB mountain of assets like Baldur's Gate 3 in there. I guess it'd probably work thanks to DAOS, but it's making me think I may want to add support for direct storage in your own S3-compatible bucket. That'd be a nice extension to the JEE project's NoSQL driver for sure.

Future Additions

Being a low-key side project, work on it is coming in dribs and drabs, but I definitely have a bunch more that I want to do.

I've recently been working on integrating IGDB data (which does have an official API) to supplement the sparse game info from GOG to make the local library more pleasant and useful as it grows. I would love to add support for itch.io as a source of games, though the lack of API and the frequency with which you purchase games without logging in makes that one a daunting prospect. Along similar lines, the Internet Archive's software catalog is a natural fit - while they obviously intend to keep going, they have a lot of foes that could make that difficult in the future.

And, beyond that, this is also just a good playground to tinker with web stuff. I always challenge myself to keep the JavaScript to a minimum when I have the choice and I like to use newer-era HTML elements like <progress> when I can. I've been using the Milligram CSS "framework" as the baseline for the design, since it has a philosophy I like: style basic HTML elements so they look nice and coherent and don't force a lot of weird mangling of your code. It admittedly might be a little too sparse for a non-designer like me to make something attractive out of, but it's been serving me well in the scaffolding phase.

I'd also like to use this as a playground to push the boundaries of the Jakarta Concurrency support. I'm already using it to handle game downloads, but I'd love to have periodic background tasks to do things like check for updates to already-downloaded games and potentially auto-download new purchases that fit some criteria (e.g. "under 10 GB"). On paper, that should work, but the XPages stack is a hostile place for long-running code like that, so it'll be a good way to find the limits.

Anyway, this is a fun one. It's good to play around with these various APIs, Domino is a good conceptual match for it, and - at the risk of sounding too grandiose - I think it's a moral good to have more self-hosted tools like this. I'm looking forward to seeing how it evolves.

World of Warcraft's Silithid Quest Chain

Sun Mar 30 21:50:14 EDT 2025

Last year at this time, I took the opportunity at the end of Marchintosh to write a post about one of my favorite old Mac games, Realmz, and I figured I'd do similarly today. This one isn't really classic-Mac related, but there's a tenuous connection: Blizzard was, for a while, one of the best major game makers in terms of porting their games to the Mac.

The Tenuous Connection

In the 90s, gaming on the Mac was largely its own thing, for both general cultural reasons (Mac users loved these weird little shareware games) and market ones (it was an even smaller target market than today). In general, a popular game on DOS or Windows had little chance of showing up on the Mac. There were companies like Aspyr that specialized in porting games over to the Mac, and sometimes these ended up really nicely. I remember that the port of Doom got a nice resolution upgrade for its delay. It was pretty rare, though, that a company would keep step themselves.

For this era, Blizzard was surprisingly good about this. Warcraft 1 came out in late 1994 and had its Mac version come out only about a year and a half later. Warcraft 2, Diablo, and StarCraft kept a similar cadence, with the Mac port coming out a year-ish after the first release.

Beyond porting the games, Blizzard pulled this classy move:

Photograph of the multi-platform StarCraft CD-ROM

Once the Mac port came out, they started including it on the same discs as the Windows one, which was particularly useful for households like ours that had a split of platforms.

By the time they got to Diablo 2, the window had narrowed down to a month, and so their games started effectively coming out on both platforms at the same time. Blizzard also did a solid job of keeping pace with Apple's various platform shifts, adding OS X compatibility to their games from the era around when that OS came out. That's something that continues in World of Warcraft to this day: WoW has followed the Mac from 32-bit PowerPC, to 32-bit x86, to x86-64, and now to ARM, all basically right as the transitions happened, putting other game makers (and enterprise-software vendors) to shame.

Sadly, we're in a backslide period with this, but that's not important now. The point today, to get finally back to it, is one of my favorite memories from the original versions of WoW.

Early World of Warcraft

Though I had imagined I resisted the siren song of WoW for a long time, I remember that Maraudon had recently come out when I started and that Captain Placeholder was there, meaning that I caved somewhere between January 21 and March 7, 2005. My first long-lasting character was a Night Elf Hunter (of course), and I ended up in an all-Hunter guild. If you know anything about WoW, you can immediately see that, as thematic as the Nesingwary Safari Co. was, it was not really practical as a proper guild.

Eventually, I found that my boss also played WoW, so we picked a server to set up shop Horde-side. Correctly, I started a Troll Hunter who lives to this day, and in whose image my Classic adventures pretty much always start. He's also the guy who first experienced the quest line that I'm thinking of today.

Vanilla WoW's Quest Chains

As WoW's expansions have rolled out over the decades (!), their questing storylines have gotten pretty focused. Each zone will have its own story, and there will be side quests that are their own thing, but everything feels very intentional and coordinated. Each zone's plot informs the others, and the themes of the expansion are (for better and worse) pretty consistent.

Vanilla WoW, for a lot of reasons, didn't really work like that. Because it was comparatively early in the history of MMOs and because the people working on it didn't have it down to a science yet, things were more scattershot in a way that has a lot of charm. Some quests (especially in the early Human zones) are clearly the designers wanting to put classic D&D/RPG tropes into their new game - and those zones are also filled with little touches that betrayed that they spent more work there than in most places.

This continued through all of the zones in the game, which varied wildly in their focus. A lot of them were following up on threads from Warcraft 3: the Night Elves coming out of seclusion, the Horde scraping out a home in Durotar and The Barrens, the Forsaken waking in the ruins of their plagued homelands. Others, like the Human and Dwarf areas, picked up on threads from Warcraft 1, and also established their own new lore.

It's in those Human and Dwarf zones where you get one of the few long-form quest chains that vanilla WoW had to offer. The early zones offered an uncharacteristically-tight story that was (spoilers, I guess) picked up at max level when you delve into Blackrock Depths and eventually fight Onyxia.

The Silithids

Though less heralded (reasonably), the Horde had something sort of like this, but it wasn't as fleshed out. Presumably, this was both because it didn't need to be and because its crescendo didn't happen until a year and a half after launch. This is the one I want to talk about today, because I've always really liked how subtly it was woven into the zones that a young Orc, Tauren, or Troll was likely to quest through.

It starts pretty early, in the Crossroads. When you hit level 17, you'll be offered this quest by a Troll named Korran:

Screenshot of the "Egg Hunt" quest dialog from Classic WoW

It's pretty innocuous, especially when included in the torrent of quests you get around that point. It's also way down in the southern half of the Barrens, which took freaking forever to walk to, so you were likely to leave it languishing in your quest log for a while. And the quest itself is not particularly special: you go down there, click on the clickable items until you have enough eggs, and then head back north. Mostly of note is that you fought the nasty Silithid Swarmers, who were unusual for early-level enemies in that they kept a bunch of annoying adds with them. This was a clever way for the game mechanics to subtly reinforce the story.

Then, as was vanilla's way, things quieted down for a while. It's not until 10 levels later, when you're going to embark to the Thousand Needles, that Korran follows up with another quest, where his boss shares his growing concern:

Screenshot of the "The Swarm Grows" quest from Classic WoW

So far, these are the only two people talking about these insects (unless you're a Warrior) - all of your other quests are about local threats like the centaur, harpies, and so forth. Hints start dropping more once you're in the Thousand Needles, though. A Tauren named Hagar Lightninghoof gives you a quest to find a reported "alien egg":

Screenshot of the "Alien Egg" quest from Classic WoW

Then, as you're following up in the "The Swarm Grows" questline, you're sent to an abandoned Dwarven dig site in the southern Shimmering Flats. It's quickly obvious why it was abandoned:

Screenshot of the Silithid cave entrance in the Rustmaul Dig Site

There may have been some odd outcroppings when you dug up eggs in The Barrens, but this is a different scale entirely, a cave with oddly-organic protrusions in what's otherwise just a dig site in a salt flat. Things don't get any friendlier when you enter the cave:

Screenshot of the inside of the Silithid cave in the Rustmaul Dig Site

Still, while ominous and a bit gross, you hand in your bug parts and go about your day. Before too long, you'll make your way south to the neutral Goblin town of Gadgetzan, where you do all sorts of odd jobs for the locals. One - checking on the water supply to see if the local bandits are interfering - ends up with a surprise encounter with some more Silithid:

Screenshot of the "Gadgetzan Water Supply" quest from Classic WoW

There are some more encounters with the insects in Tanaris - including some horrifying lairs - and things start to pick up. That quest kicked off an eight-level-spanning chain that will eventually take you to the neighboring Un'goro Crater. You're also likely, around this time, to take a trip northwest to the verdant Feralas for a few levels. While a lot of the quests there revolve around the local gnoll and ogre populations, our invasive "friends" pop up again in the south of the zone (a recurring theme):

Screenshot of The Writhing Deep in Classic WoW

You find similar hives in Un'goro Crater, and a clear pattern emerges: the further south and west you go, the more numerous and powerful these creatures become, and it is obvious that the small nest you found in the Barrens wasn't a fluke. The nightmarish depth of the problem becomes obvious when you, at about the level cap, finally arrive in the subtly-named zone Silithus:

Screenshot of the entrance to Silithus in Classic WoW

Ah, right, well then. You've grown in power alongside the Silithid you've found, but the ones here are much more powerful still.

As a neat meta-game note, the progression of this quest chain played out in the release process of the game. While major patches to the game's expansions almost always served specifically to advance the lore, vanilla was less consistent. Until the last few, most of the major patches were filling in things that were, from the perspective of the game's lore, always there. They'd rise to prominence because of some new focus, but presumably one could have in-universe gone to Maraudon or Dire Maul and found the same stuff before it was actually implemented in the game.

Silithus was a bit different, largely due to its original release state and its phased reconstruction. It originally consisted of basically a tiny camp with a flight point at the entrance and then a bunch of hives and cultists. Patch 1.8.0 brought a new quest hub and some actual story to the pieces, and it felt in lore that it wasn't just an implementation of something already there: the Silithid and the cultists supporting them were on the move. This came to full fruition in the next major patch, which added the raids and the most globally-immersive world event they've ever done.

The World

With that world event, every character - from the lowest levels to the peak raiders - had some part to play in the war effort. Something that started out as just a few leveling quests for baby Horde characters here ended up being something of global importance. And, with the pace of leveling in vanilla WoW, a player would likely be spending physical months having this doled out to them, so slowly that it didn't even really feel like a consistent tale the first time I went through it. But it is consistent, and in a way that feels unlike anything the game is set up to do now.

While I still enjoy current WoW, this is the sort of thing that makes me miss the vanilla days (even beyond the usual rose-colored glasses). The general unpolished feeling of this and other parts made the world feel a bit more real, and the fact that vanilla covered way more space and storylines than any expansion does meant that there was room for thin but long-form quest chains like this.

Fortunately, the prevalence of Classic realms means that it's mostly there to be played through now. It's lost the meta-game mechanical touch of the progressive rollout of patches, but the lower-level quests are all there and it'll still take you a long time in between them. If it's been a while for you, maybe roll up a Horde character in Classic and give it a shot. The last time I leveled (to get these screenshots), I was prepared to find these breadcrumbs and was delighted each time I saw one.

And, unlike the Alliance's precious The Missing Diplomat, this one has a worthwhile conclusion.