Group Admins

  • Ethan Watrall

Institute for Digital Archaeology Method & Practice

Public Group active 5 years, 11 months ago

Discussion group for the members and faculty of the NEH-funded Institute for Digital Archaeology Method & Practice, organized by Michigan State University’s Department of Anthropology and MATRIX: The Center for Digital Humanities and Social Sciences.

Northern Indigenous Copper Database

This topic contains 25 replies, has 6 voices, and was last updated by Matthew Pike 6 years ago.

Viewing 15 posts - 1 through 15 (of 26 total)

    Matthew Pike

    I wanted to go ahead and get a thread going for my project, with some updates on the progress I’ve been making and some of the thoughts and questions I have moving forward. Much of the last few months has been spent on research trips adding to and organizing my copper database. I have more than doubled the number of artifacts I have now examined and intend to include (and also doubled my photographs, analyses, geospatial data, and other features to be included in the database). In addition, I have been setting up trips into Dené and Inuit communities in the Northwest Territories and Nunavut, in part to consult on issues of indigenous representation in my online database. Currently I am unsure of the timing of these consultation trips, although it is beginning to look as if they will be next fall, after next year’s meeting of MSUDAI. Either way works, as I will have a workable version of the database by then.

    In terms of technical details on the creation of the database, I don’t have as much to report. The main decision I have made is to explore the use of the Nunaliit Atlas Framework as a basis for constructing my database. Its benefits include a ready-made design that balances geospatial and metadata visualization and the ability to incorporate multiple media types, including sound and video. In addition, it was initially designed for collaboration with northern indigenous communities and includes the ability to incorporate data from multiple registered users and to function offline for remote data collection. Its design (judging from comments and suggestions in the user community) takes into consideration accessibility from remote communities that may have variable internet access. It has been used for traditional place names research among the Gwich’in Dené and Kitikmeot Inuit, for the documentation of traditional Inuit trails, and for Inuit sea ice use and occupancy. Not only are these good examples of Nunaliit’s potential functionality, they also provide examples of other projects that have used Nunaliit in the specific communities I am targeting. Sharing visual and functional similarities with these other projects would facilitate easier interaction with my project. It also apparently was funded in part by one of the institutions I am collaborating with, the Kitikmeot Heritage Society. Nunaliit is open source with a GitHub page, and appears to have periodic updates with support for users. It seems like a pretty good fit; my only real question (and it’s not a small one) is how to organize the data on the back end.


    Daniel Pett

    Hi Matthew,
    Reading your vision statement and the above makes me think you are beginning to make very good progress. The software you have chosen looks interesting, though I guess you’ll have to be a Java wizard to really hack it around. The examples of what it can do are a bit varied in quality, but I like the Gwich’in one especially.
    You are planning to do a lot in the year – do you think it is achievable from your point of view? Are you aiming too high? What is the most important part of this project for you to deliver?
    You asked about database schema – have you made a start on this? What sort of data do you have?


    Matthew Pike

    My main goal for the product that I want to present to the Institute in August is a basic online map that can display geospatial data, images, and text metadata (standard categories such as tool type, cultural group, associated radiocarbon ages, etc.). Ideally this will be navigable through the map and searchable through the metadata. The idea is that this will be a prototype of sufficient functionality that I can take it live and use it as a tool for consultations with communities and researchers in the future. The plan is for the project to continue to grow and have a life after the Institute. Some of the other functions of Nunaliit (such as community contributions by other registered users) are beyond the scope of what I want to complete by August. I also have (or will have) the artifacts that I want to display subdivided along a number of categories (culture group, time period, complete/fragmentary), so I can focus on a specific subset if the entire database becomes too unwieldy in the prototype.

    I am not tied to a particular database format. Much of the data is in a Microsoft Access database and Excel spreadsheets, but it has not all been incorporated into a final organizational system yet. The types of data I have include:

    -geospatial points (archaeological sites – currently in ArcGIS, but I have the coordinates and can export them to other formats)
    -.jpg images of artifacts
    -X-Ray Fluorescence spectra (in an odd file type, but they can be saved as images or Excel spreadsheets)
    -text metadata for artifact context (culture, time period, tool type, etc.)

    I am envisioning some form of relational database that will allow me to have flexibility in grouping the data (e.g. all objects from one site; all sites that have this object type; this object type from this time period; etc) for both visualization and later analysis. Ideally those selective organizational criteria will be a part of the online product as well, as long as that doesn’t overstep my time limitations.
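    A minimal sketch of the kind of selective grouping I mean, over a flat array of artifact records (the field names and Borden-style site codes below are placeholders, not my actual data):

```javascript
// Hypothetical artifact records; field names and site codes are placeholders.
const artifacts = [
  { id: 1, site: 'KkHh-3', toolType: 'awl', period: 'Late' },
  { id: 2, site: 'KkHh-3', toolType: 'blade', period: 'Early' },
  { id: 3, site: 'LdLl-2', toolType: 'awl', period: 'Late' }
];

// "all objects from one site"
const fromSite = artifacts.filter(a => a.site === 'KkHh-3');

// "this object type from this time period"
const lateAwls = artifacts.filter(a => a.toolType === 'awl' && a.period === 'Late');
```

    In a relational setup these same selections become WHERE clauses or joins, so keeping each criterion as its own column should pay off for both visualization and later analysis.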

    The only concerns I have at the moment in terms of the database are access and site confidentiality. First, much of this is raw data that has yet to be analyzed or published, so I would prefer some delay in making the data completely open. A number of people have contributed to the dataset, so I don’t have complete control over the dissemination of the raw data. Second, I want to preserve site location confidentiality. My ideas for that are either to truncate coordinates (i.e., remove the seconds from the coordinates) or to display sites only by their Borden codes (the Canadian site system – essentially a grid over the whole country). It would probably function very similarly to the way DINAA does it.
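    To illustrate the coordinate-truncation idea: dropping the seconds amounts to flooring each coordinate to whole arc-minutes (roughly 1–2 km of blur). A quick sketch (truncateToMinutes is an illustrative helper, not from any library):

```javascript
// Drop the seconds from a decimal-degree coordinate, i.e. floor it to
// whole arc-minutes. Illustrative helper only, not from any library.
function truncateToMinutes(decimalDegrees) {
  const sign = decimalDegrees < 0 ? -1 : 1;
  const abs = Math.abs(decimalDegrees);
  const degrees = Math.floor(abs);
  const minutes = Math.floor((abs - degrees) * 60); // seconds discarded here
  return sign * (degrees + minutes / 60);
}

console.log(truncateToMinutes(68.34917)); // 68°20'57" floors to 68°20'
```

    Run something like this over the coordinate columns before the file is ever shared, and the precise values stay offline.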

    Those are basically my thoughts: an online map that displays images and contextual data, where I retain control over the raw data, and that can be scaled up or adapted in the future.



    Eric Kansa

    Hi Matthew,

    Cool project, and great to see you making progress! On the 3D side of things, I think you should look into X3DOM:

    It uses open, Web-friendly technologies for 3D in browsers without plugins. Some of the main concerns center on longer-term care of the data. Are you going to use Kora, tDAR, OC, or something else for digital preservation? I think that preservation in those places is of secondary concern and that the more important issues are the needs of the tribal communities you are partnering with. But it’ll be good to address the issue so that the long-term disposition of the digital data is acceptable to everyone involved.


    Matthew Pike

    Thanks for the x3dom tip, I will have to investigate that further. I have been messing around with an Artec Spider 3D scanner that I have access to in our department, and I am hoping to take it north with me to obtain 3D scans of artifacts that have made their way into local homes in the region. I will have to see if the software and file types are compatible. Ultimately I think the 3D portion of the project will end up being an upgrade at some point, although it might be nice to have one or two examples when I publish the database as a prototype for feedback from the Institute and northern communities.

    In terms of the repository database, I am still not sure of the best way to proceed. The Nunaliit Atlas Framework uses CouchDB to organize data, although I’m honestly not sure if that falls into the same category as digital archives like Kora and tDAR. I’ve gotten a little overwhelmed by the potential choices of software, and I’m not 100% sure of the best way to make that decision. Ultimately, data for the project will be archived on university servers at the Purdue University Research Repository (PURR), but this isn’t a repository that allows for easy integration with things like web mapping.

    I am currently working on downloading and installing the Nunaliit Atlas Framework, which involves downloading CouchDB and other programs. The process is more complicated than I expected, with a lot of command-line instructions that I am not experienced with. I’ll continue to chip away at it in order to assess the Nunaliit software and see if it is the ideal product to be using.


    Matthew Pike

    I have switched gears in the past couple of weeks, which I mentioned in my previous blog post but neglected to mention in the discussion group here. I have abandoned Nunaliit for a variety of reasons, foremost of which was the fact that it was taking way too much time and energy even to learn how to install the program, let alone learn how to manipulate it. I have since switched to using Bootleaf, with the ieldran database as a template, and am publishing the website through GitHub.

    My GitHub –

    The webpage –

    I feel as if I have made more progress in the last week than I had in the last month, so I believe this was a good decision. I have been playing with the window dressing (fonts, colors, etc.), but have gotten stuck on the specifics of converting .csv files to .geojson and projecting them on the map. I have been able to use other people’s public trial and error on the Commons to get myself up and running so far (thanks everyone!) but haven’t come across a solution to this problem yet. In the ieldran script, it looks like the GeoJSON files are being loaded using $.getJSON, which pulls from a .geojson file on the GitHub page, but I’m not sure how the .csv files housed in GitHub are converted to those .geojson files. Any help from anyone would be greatly appreciated.
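    For what it’s worth, the conversion itself is mechanical once the CSV rows are parsed: each row becomes a GeoJSON Feature with a [longitude, latitude] Point. A hand-rolled sketch (column names are placeholders; in practice a parsing library handles the CSV step):

```javascript
// Turn parsed CSV rows into a GeoJSON FeatureCollection.
// Column names (lat, lon, site, toolType) are hypothetical placeholders.
function rowsToGeoJSON(rows) {
  return {
    type: 'FeatureCollection',
    features: rows.map(row => ({
      type: 'Feature',
      geometry: {
        type: 'Point',
        // Note: GeoJSON coordinate order is [longitude, latitude]
        coordinates: [Number(row.lon), Number(row.lat)]
      },
      properties: { site: row.site, toolType: row.toolType }
    }))
  };
}
```

    Plugins that load CSV onto Leaflet maps are doing essentially this step internally before handing the result to the map.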



    Matthew Pike

    Hello all,

    I am a bit stuck on integrating points into my leaflet map.  I’ve found a number of different options, but am having trouble figuring out how to actually include them on my site.

    The Leaflet plugin that I’ve been trying (and failing) to integrate is leaflet-simple-csv. Another one I was looking at was Leaflet.geoCSV. I’m pretty sure that it’s just my lack of experience and practice with coding, but I can’t for the life of me figure out how to integrate it.

    My GitHub page and my website are linked above. Any help or guidance would be much appreciated. I’ll keep at it.


    Daniel Pett

    Where’s your csv hosted? Can I see it?


    Matthew Pike

    I don’t actually have one at the moment, but that seemed the easiest way to do it. I was modeling it somewhat off the way that Katy put together ieldran, and planning on creating a file called data.csv (which is how leaflet-simple-csv is designed) and putting it up on GitHub. I can put a file together tomorrow; I just have to do some modification of site locations before my Excel file touches GitHub. My main mental stumbling block at the moment is how to integrate the code from a plugin into the (admittedly rudimentary) structure I currently have. Thanks for the quick response – I’ll be sure to get a .csv up on my GitHub by tomorrow at the latest.


    Daniel Pett

    Have you seen this?

    This is a nice example of CSV being rendered on a map:


    Eric Kansa

    What Dan said… 🙂

    Anyway, probably the best thing to do is look at the examples that illustrate how CSV plugins for Leaflet work in practice. Normally the people who write the plugins have some demo pages, and those are usually the best places to start in figuring out how to use a plugin.

    That said, there are lots of different programming styles in JavaScript, so it can be really hard to grok, especially at first. I’m still very easily confused by it.


    Matthew Pike

    The example Dan posted above uses another good option that I was looking at – the Omnivore plugin. I suppose this is why branching is so central to GitHub; I’ll just have to play around with the examples in a branch and see what works and what doesn’t. I think the different programming styles are probably throwing me off in trying to integrate something, since I’m not sure exactly how they function. Thanks for the suggestions – I’m planning to spend a good chunk of today on the problem, so I’ll see if I can get something halfway workable by the end of the day.


    Matthew Pike

    Ok, so I have switched over to pulling my map from Mapbox so that I can try to use the Omnivore plugin. I chose Omnivore mainly because it seems the most polished, and I can also access it via a CDN instead of putting all the code into my HTML doc. It seems relatively straightforward, but my only stumbling block at the moment (that I know of) is pulling in the .csv data. In the plugin example, the data seems to be coming from Mapbox.js, and I’m not sure how to have it pull from my GitHub. I now have a rough data.csv file on GitHub to play around with, so I will keep digging into either how to upload the file to Mapbox or how to connect to GitHub.
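    On pulling the .csv from GitHub: one route is GitHub’s raw-content host, which serves the file as plain text that Omnivore can fetch. A sketch (the user/repo/branch names and the ghRawUrl helper are placeholders):

```javascript
// Build a raw.githubusercontent.com URL for a file in a GitHub repo.
// Illustrative helper; user, repo, and branch values are placeholders.
function ghRawUrl(user, repo, branch, path) {
  return 'https://raw.githubusercontent.com/' + user + '/' + repo +
         '/' + branch + '/' + path;
}

// In the page, with Leaflet and leaflet-omnivore loaded:
//   omnivore.csv(ghRawUrl('someuser', 'somerepo', 'master', 'data.csv')).addTo(map);
```

    This keeps data.csv versioned in the repo while the map fetches it directly, with no separate upload to Mapbox needed.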

    I’m also planning on incorporating the MarkerCluster plugin, but I figure I should get my data to sync first before trying to get fancy.


    Neha Gupta

    Matt, for using Omnivore with your CSV, you can have a look at my GitHub. I have the CSV on GitHub, and use Omnivore to parse the CSV to GeoJSON. The code is towards the bottom of index.html.

    The script takes the GeoJSON and adds it to the map. Brian’s original code for DAEA had simple markers with a pop-up on click, which I have added as well.

    Very interested in your clusterMarkers, as I plan to play with that js as well.




    Ethan Watrall

    Yup – what Neha says. Have a look at the source of the Digital Atlas of Egyptian Archaeology. It uses Omnivore to pull info from a local CSV to display pins and pop-ups on a map.


