Discussion group for the members and faculty of the NEH Funded Institute for Digital Archaeology Method & Practice (http://digitalarchaeology.msu.edu) organized by Michigan State University’s Department of Anthropology and MATRIX: The Center for Digital Humanities and Social Sciences
Northern Indigenous Copper Database
December 15, 2015 at 3:11 am #318
I wanted to go ahead and get a thread going for my project, with some updates on the progress I've been making and some of the thoughts and questions I have moving forward. Much of the last few months has been spent on research trips adding to and organizing my copper database. I have more than doubled the number of artifacts I have now examined and intend to include (and also doubled my photographs, analytical data, geospatial data, and other features to be included in the database). In addition, I have been setting up trips into Dené and Inuit communities in the Northwest Territories and Nunavut, in part to consult on issues of indigenous representation in my online database. Currently I am unsure of the timing of these consultation trips, although it is beginning to look as if they will be next fall – after next year's meeting of MSUDAI. Either way works, as I will have a workable version of the database by then.
In terms of technical details on the creation of the database, I don't have as much to report. The main decision I have made is to explore the use of the Nunaliit Atlas Framework (www.nunaliit.org) as a basis for constructing my database. Its benefits include a ready-made design that balances geospatial and metadata visualization and the ability to incorporate multiple media types, including sound and video. In addition, it was initially designed for collaboration with northern indigenous communities and includes the ability to incorporate data from multiple registered users and to function offline for remote data collection. Its design (judging from comments and suggestions in the user community) takes into consideration accessibility from remote communities that may have variable internet access. It has been used for traditional place names research among Gwich'in Dené (http://atlas.gwichin.ca/index.html) and Kitikmeot Inuit (http://atlas.kitikmeotheritage.ca/index.html?module=module.intro), and for the documentation of traditional Inuit trails (http://paninuittrails.org/index.html) and Inuit sea ice use and occupancy (http://sikuatlas.ca/index.html). Not only are these good examples of Nunaliit's potential functionality, they are also examples of other projects that have used Nunaliit in the specific communities I am targeting. Sharing visual and functional similarities with these other projects would make interaction with my project easier. It also was apparently funded in part by one of the institutions I am collaborating with – the Kitikmeot Heritage Society (http://www.kitikmeotheritage.ca/). Nunaliit is open source with a GitHub page (https://github.com/GCRC/nunaliit/wiki), and appears to have periodic updates with ongoing support for users. It seems like a pretty good fit; my only real question (and it's not a small one) is how to organize the data on the back end.
December 16, 2015 at 2:56 pm #335
Reading your vision statement and the above makes me think you are making very good progress. The software you have chosen looks interesting, though I guess you'll have to be a Java wizard to really hack it around. The examples of what it can do are a bit varied in quality, but I like the Gwich'in one especially.
You are planning to do a lot in the year. Do you think it is achievable from your point of view, or are you aiming too high? What is the most important part of this project for you to deliver?
You asked about database schema – have you made a start on this? What sort of data do you have?
Dan
December 16, 2015 at 6:20 pm #342
My main goal for the product that I want to present to the Institute in August is a basic online map that can display geospatial data, images, and text metadata (standard categories such as tool type, cultural group, associated radiocarbon ages, etc.). Ideally this will be navigable through the map and searchable through the metadata. The idea is that this will be a prototype of sufficient functionality that I can take it live and use it as a tool for consultations with communities and researchers in the future. The plan is for the project to continue to grow and have a life after the Institute. Some of the other functions of Nunaliit (such as community contributions by other registered users) are beyond the scope of what I want to complete by August. I also have (or will have) the artifacts that I want to display subdivided along a number of categories (culture group, time period, complete/fragmentary), so I can focus on a specific subset if the entire database becomes too unwieldy in the prototype.
I am not tied to a particular database format. Much of the data is in a Microsoft Access database and Excel spreadsheets, but it has not all been incorporated into a final organizational system yet. The types of data I have include:
-geospatial points (archaeological sites – currently in ArcGIS, but I have the coordinates and can export them to other types.)
-.jpg images of artifacts
-X-Ray Fluorescence spectra (in an odd file type, but they can be saved as images or Excel spreadsheets)
-text metadata for artifact context (culture, time period, tool type, etc.)
I am envisioning some form of relational database that will give me flexibility in grouping the data (e.g. all objects from one site; all sites that have this object type; this object type from this time period; etc.) for both visualization and later analysis. Ideally those selective organizational criteria will be part of the online product as well, as long as that doesn't overstep my time limitations.
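To make the grouping idea concrete, here is a minimal sketch in plain JavaScript of the kinds of selections I mean. The records and field names below are hypothetical placeholders, not my actual schema:

```javascript
// Hypothetical artifact records; field names are placeholders, not the real schema.
const artifacts = [
  { site: "NkVi-3", type: "awl", period: "Late Precontact", complete: true },
  { site: "NkVi-3", type: "blade", period: "Late Precontact", complete: false },
  { site: "ObRw-1", type: "awl", period: "Historic", complete: true },
];

// All objects from one site:
const fromSite = artifacts.filter(a => a.site === "NkVi-3");

// All sites that have a given object type (deduplicated, insertion order kept):
const sitesWithAwls = [...new Set(
  artifacts.filter(a => a.type === "awl").map(a => a.site)
)];

console.log(fromSite.length);  // 2
console.log(sitesWithAwls);    // [ 'NkVi-3', 'ObRw-1' ]
```

Whatever back end I land on, these are the queries it needs to support.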
The only concerns I have at the moment in terms of the database are access and site confidentiality. Much of this is raw data that has yet to be analyzed or published, so I would prefer some delay in making the data completely open. There are a number of people who have contributed to the dataset, so I don't have complete control over the dissemination of the raw data. Second, I want to preserve site location confidentiality. My ideas for that are to either truncate coordinates (i.e., remove the seconds from the coordinates) or display sites only by their Borden codes (the Canadian site system – essentially a grid over the whole country; it would probably function very similarly to the way DINAA does it).
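For the truncation option, a minimal sketch of what I have in mind (dropping decimal places of decimal degrees, which amounts to the same thing as stripping seconds):

```javascript
// Reduce coordinate precision for public display.
// Two decimal places of latitude is roughly +/- 1 km; adjust to taste.
function truncateCoord(coord, places = 2) {
  const factor = Math.pow(10, places);
  return Math.trunc(coord * factor) / factor;
}

console.log(truncateCoord(67.82649, 2));   // 67.82
console.log(truncateCoord(-115.09731, 2)); // -115.09
```

The full-precision coordinates would stay in the private copy of the database; only the truncated values would ever reach the public map.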
Those are basically my thoughts: an online map that displays images and contextual data, where I keep control over the raw data and which can be scaled up or adapted in the future.
Matt
January 6, 2016 at 9:46 pm #390
Cool project, and great to see you making progress! On the 3D side of things, I think you should look into x3dom.
It's an open, Web-friendly technology for 3D in browsers without plugins. Some of the main concerns center on longer-term care of the data. Are you going to use Kora, tDAR, OC, or something else for digital preservation? I think that preservation in those places is of secondary concern, and the more important issues are the needs of the tribal communities you are partnering with. But it will be good to address the issue so the long-term disposition of the digital data is acceptable to everyone involved.
January 18, 2016 at 10:07 pm #440
Thanks for the x3dom tip, I will have to investigate that further. I have been messing around with an Artec Spider 3D scanner that I have access to in our department, and I am hoping to take it north with me to obtain 3D scans of artifacts that have made their way into local homes in the region. I will have to see if the software and file types are compatible. Ultimately I think the 3D portion of the project will end up being an upgrade at some point, although it might be nice to have one or two examples when I publish the database as a prototype for feedback from the Institute and northern communities.
In terms of the repository database, I am still not sure of the best way to proceed. The Nunaliit Atlas Framework uses CouchDB to organize data, although I'm honestly not sure if that falls into the same category as digital archives like Kora and tDAR. I've gotten a little overwhelmed by the potential choices of software, and I'm not 100% sure of the best way to make that decision. Ultimately, data for the project will be archived on university servers at the Purdue University Research Repository (PURR), but this isn't a repository that allows for easy integration with things like web mapping.
I am currently working on downloading and installing the Nunaliit Atlas Framework, which involves setting up CouchDB and other programs. The process is more complicated than I expected, with a lot of command-line instructions that I am not experienced with. I'll continue to chip away at it in order to assess the Nunaliit software and see if it is the ideal product to be using.
March 10, 2016 at 2:32 pm #595
I have switched gears in the past couple of weeks, which I mentioned in my previous blog post but neglected to mention in the discussion group here. I have abandoned Nunaliit, for a variety of reasons, foremost of which was that it was taking way too much time and energy just to learn how to install the program, let alone how to manipulate it. I have since switched to using Bootleaf, with the ieldran database as a template, and am publishing the website through GitHub.
I feel as if I have made more progress in the last week than I had in the previous month, so I believe this was a good decision. I have been playing with the window dressing (fonts, colors, etc.), but have gotten stuck on the specifics of converting .csv files to .geojson and projecting them on the map. I have been able to use other people's public trial and error on the Commons to get myself up and running so far (thanks everyone!) but haven't come across a solution to this problem yet. In the ieldran script, it looks like the geojson files are being loaded using $.getJSON, which pulls from a geojson file on the GitHub page, but I'm not sure how the .csv files housed in GitHub are converted to those .geojson files. Any help from anyone would be greatly appreciated.
June 21, 2016 at 7:24 pm #717
I am a bit stuck on integrating points into my Leaflet map. I've found a number of different options, but am having trouble figuring out how to actually include them on my site.
The Leaflet plugin that I've been trying (and failing) to integrate is leaflet-simple-csv (https://github.com/perrygeo/leaflet-simple-csv). Another one I was looking at was Leaflet.geoCSV (https://github.com/joker-x/Leaflet.geoCSV). I'm pretty sure that it's just my lack of experience and practice with coding, but I can't for the life of me figure out how to integrate it.
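To make sure I understand what these plugins are actually doing with the csv, here is a minimal sketch of the csv-to-geojson step itself. The column names (name, lat, lon) are just placeholders, and this naive splitting assumes no quoted commas – a real parser handles more:

```javascript
// Convert simple CSV text (no quoted commas) to a GeoJSON FeatureCollection.
// Column names "name", "lat", "lon" are assumptions for illustration.
function csvToGeoJSON(csvText) {
  const lines = csvText.trim().split("\n");
  const headers = lines[0].split(",").map(h => h.trim());
  const features = lines.slice(1).map(line => {
    const row = {};
    line.split(",").forEach((cell, i) => { row[headers[i]] = cell.trim(); });
    return {
      type: "Feature",
      geometry: {
        type: "Point",
        // Note: GeoJSON coordinate order is [longitude, latitude].
        coordinates: [parseFloat(row.lon), parseFloat(row.lat)],
      },
      properties: { name: row.name },
    };
  });
  return { type: "FeatureCollection", features };
}

const csv = "name,lat,lon\nSite A,67.82,-115.09\nSite B,68.10,-114.50";
const geojson = csvToGeoJSON(csv);
console.log(geojson.features[0].geometry.coordinates); // [ -115.09, 67.82 ]
```

The plugins presumably do something like this internally and then hand the resulting FeatureCollection to Leaflet.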
My GitHub page is https://github.com/wanderer33third/MINeS, and my website is hosted at http://wanderer33third.github.io/MINeS/MINeS.html. Any help or guidance would be much appreciated. I'll keep at it.
June 21, 2016 at 10:05 pm #722
Where's your csv hosted? Can I see it?
June 21, 2016 at 10:41 pm #723
I don't actually have one at the moment, but that seemed the easiest way to do it. I was modeling it somewhat off the way that Katy put together ieldran, and planning on creating a file called data.csv, which is how leaflet-simple-csv is designed, and putting it up on GitHub. I can put a file together tomorrow and post it, I just have to do some modification of site locations before my Excel file touches GitHub. My main mental stumbling block at the moment is how to integrate the code from a plugin into the (admittedly rudimentary) structure I currently have. Thanks for the quick response, I'll be sure to get a .csv up on my GitHub by tomorrow at the latest.
June 21, 2016 at 10:53 pm #724
Have you seen this? https://gist.github.com/mgiraldo/93c3457c4f512b34433f
This is a nice example of csv being rendered on a map.
June 22, 2016 at 12:15 am #726
What Dan said… 🙂
Anyway, probably the best thing to do is look at the examples that illustrate how CSV plugins for Leaflet work in practice. Normally the people who write the plugins have some demo pages, and those are usually the best places to start in figuring out how to use the plugin.
The example Dan posted above uses another good option that I was looking at – the Omnivore plugin. I suppose this is why branching is so central to GitHub; I'll just have to play around with the examples in a branch and see what works and what doesn't. I think the different programming styles are probably throwing me off in getting started, since I'm not sure exactly how they function. Thanks for the suggestions. I'm planning to spend a good chunk of today on the problem, so I'll see if I can get something halfway workable by the end of the day.
June 22, 2016 at 7:38 pm #729
OK, so I have switched over to pulling my map from Mapbox so that I can try to use the Omnivore plugin. I chose Omnivore mainly because it seems like the most polished, and I can also access it via a CDN instead of putting all the code into my HTML doc. It seems relatively straightforward, but my only stumbling block at the moment (that I know of) is pulling in the .csv data. In the plugin example, the data seems to be coming from Mapbox.js, and I'm not sure how to have it pull from my GitHub. I now have a rough data.csv file on GitHub to play around with, so I will keep digging on either how to upload the file to Mapbox or how to connect to GitHub.
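In case it helps anyone following along, my current understanding is that omnivore.csv() can be pointed straight at a GitHub raw URL. A rough sketch of the page as I understand it (the repo path is a placeholder, not my real file, and the CDN paths may need updating; I'm using plain OSM tiles here just to keep the sketch token-free):

```html
<!-- Rough sketch: Leaflet + Omnivore loading a CSV from a GitHub raw URL.
     The raw.githubusercontent.com path below is a placeholder. -->
<link rel="stylesheet" href="https://unpkg.com/leaflet/dist/leaflet.css" />
<div id="map" style="height: 500px;"></div>
<script src="https://unpkg.com/leaflet/dist/leaflet.js"></script>
<script src="https://api.mapbox.com/mapbox.js/plugins/leaflet-omnivore/v0.3.1/leaflet-omnivore.min.js"></script>
<script>
  var map = L.map('map').setView([68.0, -115.0], 5);
  L.tileLayer('https://tile.openstreetmap.org/{z}/{x}/{y}.png').addTo(map);

  // omnivore.csv() fetches the file, auto-detects latitude/longitude columns
  // (e.g. "lat" and "lon"), converts to GeoJSON, and returns a Leaflet layer.
  omnivore.csv('https://raw.githubusercontent.com/USER/REPO/master/data.csv')
    .addTo(map);
</script>
```

If that works, the csv never needs to be uploaded to Mapbox at all; GitHub serves it and Omnivore does the conversion in the browser.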
I'm also planning on incorporating the MarkerCluster plugin (https://www.mapbox.com/mapbox.js/example/v1.0.0/markercluster-with-mapbox-data/), but I figure I should get my data syncing first before trying to get fancy.
June 23, 2016 at 11:51 am #731
Matt, for using Omnivore with your csv, you can have a look at my GitHub (https://github.com/dngupta/mina.github.io). I have the csv on GitHub and use Omnivore to parse it to geojson. The code is towards the bottom of index.html.
The script takes the geojson and adds it to the map. Brian's original code for DAEA had simple markers with a pop-up on click, which I have added as well.
Very interested in your clusterMarkers, as I plan to play with that JS as well.