This research was supported by use of the ARDC Nectar Research Cloud, a collaborative Australian research platform supported by the NCRIS-funded Australian Research Data Commons (ARDC).

System updates and changes

2021 – ‘Testing and Consolidation’

A grant from the ARDC allowed us to continue development for another year.

The Gazetteer of Historical Australian Places (formerly ‘Placenames’) was identified as the centrepiece of TLCMap, based on responses from grant partners and early adopters. With very limited development resources, we focused on consolidating work around this application. Other systems, while they should remain functional as standalone tools, are being integrated around the Gazetteer where prioritisation and resources permit. Importantly, the ‘Gazetteer’ is more than just a gazetteer: because researchers are incentivised to contribute data in return for web visualisations, other features, and discoverability, it becomes a crowd-sourced, cumulative spatiotemporal index of cultural information.

Although TLCMap work was delayed by unforeseen problems with other project commitments, the November upgrade paves the way for much wider ingest of data and promotion.

Some of the enhancements to the Gazetteer may not be immediately apparent, but they establish a solid foundation to build on in future (e.g. a unique identifier policy, handling of extended data in data import and export, handling of ‘placename’ and/or ‘title’, more forgiving handling of dates, and many other bug fixes and enhancements). With this established, future work can focus on visualisations, analytics and the handling of alternative spatiotemporal structures.

Many prototypes developed in 2020 are still highly desirable, but unfortunately have to be sidelined pending future funding.

Recogito enhancements were ready for launch and testing at the end of 2021, but unforeseen problems and interruptions meant release was not possible; an estimated month of further work is required.

Features from the Spatiotemporal metrics prototype will be integrated into the Gazetteer. The Gazetteer back end has been modified to enable this.

2019-2020 ‘Speculation and Consultation’

The first year was made possible by an ARC grant funding a small team of part-time developers. It was a speculative phase: we consulted widely to obtain a broad understanding of spatiotemporal needs in the humanities, assessed the state of the art, and initiated enhancements and prototypes to fill gaps in capabilities and prompt step changes in existing ones.

A rapid prototyping approach was taken to quickly identify, prioritise and adapt to user requirements, and to ensure functionality was developed in all areas.

The diverse areas of thematic interest (from basic timelines to virtual reality and time-based warping of space), together with the seven fundamentally different spatiotemporal structures identified for representing time in the humanities, led to an endless stream of desirable features that had to be prioritised and prototyped with limited time and resources.

This resulted in some systems being identified as more successful and desirable than others, leading into a consolidation phase in 2021.

TLCMap is funded by a one-year grant for rapid development to make it easier for humanities researchers to work with digital maps. This involves simplifying existing functionality that is difficult to use or requires software developers, as well as developing new functionality. Two key areas identified at the outset were ‘time’ (viewing change over time in various ways, not limited to timelines) and ‘layers’ (or ‘deep maps’). To help us provide what is most useful, please contribute to our survey.

This presents some challenges, such as how to find commonality among diverse and changing needs so as to benefit the most people, while still allowing for the idiosyncrasies so often essential to the humanities, and without creating a ‘one stop shop’ that reinvents the wheel or tries to be all things to all people and fails to do anything well. We also aim to find the balance between infrastructure that is a ‘solution looking for a problem’ and projects that produce software that is not re-usable because it was designed only to achieve a research outcome for a paper. In research we often don’t know what we will find as we go, so adhering to a strict set of agreements and specifications from the outset, as is common in IT generally and in the commercial world, is not helpful.

The approaches we have adopted to deal with these complex problems are:

  1. Identify six broad themes on which digital mapping activity in the humanities generally focuses. These are not mutually exclusive but help make sense of an otherwise confusing mass of projects and software.
  2. Develop with existing systems and standards that do what they do well.
  3. Identify gaps in functionality and develop new tools in those areas.
  4. Identify gaps in existing systems and develop enhancements to them.
  5. Ensure all these systems work together through interoperability standards and TLCMap conventions, so that information created in one can be exported to and imported into another. This ensures a coherent, holistic infrastructure or software ecosystem that meets different needs, rather than a set of disparate research and development projects.
  6. Ensure no ‘infrastructure’ (software, data format, etc.) is developed without a project to demonstrate its usefulness, and no project is undertaken unless it produces ‘infrastructure’ in the form of re-usable and compliant software.
  7. Adopt a rapid prototyping development style that allows for changes in priority while still ensuring we produce working software and useful outcomes.

This means there isn’t a single ‘TLCMap’ to log in to. Rather, we hope to direct you to what you need: you can access tools through the ‘themes’ or these FAQs. In some cases, because we don’t want to duplicate existing functionality, these will simply direct you to good existing solutions, sometimes involving TLCMap software and projects. As development proceeds over 2019 and 2020, the FAQs, systems and solutions will be added to.

TLCMap enables humanities researchers to build digital maps, with pathways from beginner to advanced.

TLCMap provides access to platforms for humanities researchers to use, create and integrate datasets and to create interactive visualisations, tailored to the humanities’ epistemological and methodological needs.

Working within the ever-evolving ecosystem of mapping software, we are integrating with existing software and developing new tools and techniques to make common tasks easier and new things possible.

Digital map making is a growing area in humanities. Digital maps help answer research questions, turn research outcomes into research tools for others, and are an interactive and visual way to involve and engage the community.

TLCMap is an online research platform to deliver researcher driven national-scale infrastructure for the humanities, focused on mapping, time series, and data integration.

TLCMap will expand the use of Australian cultural and historical data for research through sharply defined and powerful discovery mechanisms, enabling researchers to visualise hidden geographic and historical patterns and trends, and to build online resources which present to a wider public the rich layers of cultural data in Australian locations.

TLCMap is not a singular project or software application with a defined research outcome; it is infrastructure linking geospatial maps of Australian cultural and historical information, adapted to time series, and will be a significant contribution to humanities research in Australia. For researchers, it will transform access to data and visualisation tools and open new perspectives on Australian culture and history. For the public, it will increase access to historical and cultural data through visualisations made available online and in print.

Humanities researchers are faced with a bewildering array of software: some established, some not, designed for a wide variety of tasks, some with similar functionality and varying degrees of effectiveness. Often a project requires the expense of employing a software developer for an extended period. We aim to provide easy entry points based on different research needs, and to ensure there is a way to progress to more advanced systems as the idiosyncratic needs of a Digital Humanities project emerge.

One of the main challenges for digital mapping infrastructure in research and the humanities is the tendency for project software to be abandoned once results are acquired and the funding runs out. A great deal of value is lost because such software is often not made re-usable. At the same time, infrastructure investments often struggle to find users and projects, and risk becoming ‘solutions in search of problems’.

Deb Verhoeven’s definition of ‘infrastructure’ provides a good way to understand what we are aiming to achieve: ‘the conditions of possibility for certain types of activity’. Rather than being a research project, we aim to establish the conditions of possibility for humanities researchers to use digital maps in their research.

  • This is infrastructure: ‘the conditions of possibility for certain types of activity’ [Verhoeven]
  • No project without infrastructure. No infrastructure without a project.
  • Infrastructure saves wasted project development. Projects prove usefulness of infrastructure.
  • Bridge the gap between solutions looking for problems and projects that die on launch.
  • Ecosystem not monolith.
  • Integration before development or duplication.
  • This is humanities. Not STEM. Not commercial. There are fundamental epistemic and methodological differences. STEM and commercial technology are useful to humanities ends. We are constrained to use STEM and commercial technology but our needs are not limited to it – hence bricolage.
  • This is not only about space but time. Time is important not only for investigating changes represented by data, but also changes in form.
  • Not only modern cartographic maps, but lienzos, paintings, imaginary places. Maps may be in any medium – a 2D picture, a 3D model, audio, video, song, dance, narrative, etc.

Some development, along with research will occur at participating institutions. The central development team is:

Bill Pascoe

System Architect

Alice Jackson

Project Manager

Matt Coller

Snr Front End Developer (Temporal Earth)

Zongwen Fan

Developer (TextMapText / Recogito)

Kaine Usher


Ben McDonnell

Developer (Gazetteer)

Dan Price

Development, Content and Testing

The following simple but powerful points will enable TLCMap systems and projects to work together as a cohesive whole rather than as a set of disparate research, development and humanities projects. They are all framed as ‘should’ rather than ‘must’, because the diversity of areas of development means that in reality some points may not be applicable. While some may seem obvious and should be standard practice, there are many cases in which they are not implemented, so they are worth stipulating.

All TLCMap projects should, where feasible and allowable:

  • No infrastructure without projects. No system should be built without projects to demonstrate usefulness.
  • No project without infrastructure. No project should be undertaken unless it is using software that can be re-used for similar projects.
  • Import/export in standard formats. All systems should enable import and export of spatiotemporal data in standard formats such as KML, GeoJSON, JSON-LD, CSV, RO-Crate and other relevant standards.
  • Layers. Systems should allow import/creation/visualisation/use of more than one dataset at a time.
  • Web Services. Systems should expose information through RESTful web services APIs for potential re-use in other developed systems.
  • Public, private, group permissions. All systems should enable at least these privacy settings for people to work comfortably and collaboratively, and make information available if and when ready.
  • All entities should have unique URLs. All relevant entities within a system should be uniquely addressed with a URL. Through a web service, the URL returns relevant data for that entity. Through user interfaces, the application should ‘zoom’ to or load the entity when the URL is visited in the browser. What counts as a relevant ‘entity’ depends on the context and application: e.g. it may be a place on a map, a word in a text, an image, a section of an image, etc.
  • Entity URLs in exported data. Data exported or made available through a web service should include the URL of the entity/record in the system it was exported from. This enables data to work among all compliant systems. E.g. I am viewing a dataset in a time visualisation alongside other datasets, and my dataset was created in a text analyser. While in the time visualiser, I can still directly access the relevant text for a point by clicking the link that goes to the text analyser.
  • Entity URL chaining. Data imported and exported should not overwrite existing entity URLs, but add to a list of URLs from other systems. This enables linking among many systems through which the information has passed.
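The export and URL-chaining conventions above can be sketched in code. This is a minimal illustration only: the property names (`tlcmap_url`, `source_urls`) and the example.org URLs are assumptions made for the sake of the sketch, not a published TLCMap schema.

```python
import json

# Hypothetical GeoJSON Feature as a compliant system might export it.
# "tlcmap_url" holds this system's entity URL; "source_urls" holds the
# chain of URLs from systems the data has already passed through.
feature = {
    "type": "Feature",
    "geometry": {"type": "Point", "coordinates": [151.78, -32.93]},
    "properties": {
        "name": "Newcastle",
        "tlcmap_url": "https://example.org/gazetteer/places/12345",
        "source_urls": ["https://example.org/textmap/docs/9/word/204"],
    },
}

def import_feature(feature, own_url):
    """Simulate a compliant import: append the exporting system's URL
    to the chain rather than overwriting it, then record this system's
    own URL for the new record."""
    props = dict(feature["properties"])
    chain = list(props.get("source_urls", []))
    if props.get("tlcmap_url"):
        chain.append(props["tlcmap_url"])
    props["source_urls"] = chain
    props["tlcmap_url"] = own_url
    return {**feature, "properties": props}

# A hypothetical time visualiser imports the feature and assigns its
# own entity URL; the earlier text-analyser and gazetteer URLs survive.
imported = import_feature(feature, "https://example.org/timeviz/records/77")
print(json.dumps(imported["properties"], indent=2))
```

Because the chain is additive, any downstream system can link back to the record in every system the data has passed through, which is what makes the ‘click through to the text analyser from the time visualiser’ scenario above possible.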