
T-LoCoH - Tips and FAQs

Tips

Tips are mini-tutorials or elaborations on specific topics or techniques.



FAQs


Selection of Method

Is T-LoCoH backward compatible with LoCoH?

Yes. T-LoCoH is the successor of classic LoCoH. The main difference between the two methods is T-LoCoH's inclusion of time in nearest neighbor identification. When s=0 (i.e., time ignored), T-LoCoH should produce the same hulls and isopleths as LoCoH.

Note however that in classic LoCoH the k-method constructs hulls from each point and its k-1 nearest neighbors. T-LoCoH, on the other hand, produces hulls from each point and its k (not k-1) nearest neighbors. So k=10 in LoCoH is equivalent to k=9 in T-LoCoH (with s=0). Parameter values for the adaptive and fixed-radius methods operate the same in both versions. R users are encouraged to use T-LoCoH rather than LoCoH because T-LoCoH is faster, has more utility functions, can handle larger datasets, and is still supported.
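
For example, a minimal sketch of reproducing a classic LoCoH run with k=10 (assuming a LoCoH-xy object named toni.lxy has already been created with xyt.lxy()):

## Sketch: classic LoCoH with k=10 corresponds to T-LoCoH with s=0 and k=9
toni.lxy <- lxy.nn.add(toni.lxy, s=0, k=9)
toni.lhs <- lxy.lhs(toni.lxy, s=0, k=9)
toni.lhs <- lhs.iso.add(toni.lhs)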

How many points can T-LoCoH handle?

R is notoriously inefficient with memory, so it depends on your hardware, the amount of physical memory, other applications open, and what you're doing in T-LoCoH (creating hulls and isopleths is memory intensive). Datasets of up to about 15,000 points run fine on modest laptop hardware; larger datasets will run but are slower, and very large datasets can produce an out-of-memory error. A set of functions and a workflow for k/a/r parameter selection has been developed for large datasets. Thinning the points with systematic sampling (i.e., keeping every nth point), or analyzing different time periods separately, are also options (see lxy.subset). You can also run T-LoCoH on powerful machines in the cloud.
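
For illustration, a hedged sketch of systematic thinning; the idx argument of lxy.subset and the $pts element of a LoCoH-xy object are assumptions here, so check the help pages:

## Keep every 4th location (sketch only; see ?lxy.subset for the actual arguments)
n <- nrow(toni.lxy$pts)
toni.lxy.thin <- lxy.subset(toni.lxy, idx=seq(from=1, to=n, by=4))
summary(toni.lxy.thin)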

When would I *not* want to use T-LoCoH?

Spatial uncertainty. Spatial uncertainty is not modeled in T-LoCoH, so if the cascading effects of position error are important, you may want to use a method that explicitly includes estimation of spatial uncertainty (e.g., Brownian Bridges).

Need quick and dirty analysis. T-LoCoH does not provide good default values for parameters for a quick and dirty analysis. This has advantages, because it forces you to play with your data a bit and select parameter values that work well with your data, your study question, and your knowledge of the system (as opposed to selecting default values that someone else selected because they worked with another dataset). But if you need a "one-click" method that will give you a homerange estimate in 10 minutes or less, consider a method that automates parameter selection.


Setup and Installation

Is there a version of T-LoCoH for ArcGIS?

No, not at this time. There was an ArcToolbox for classic LoCoH, but it doesn't work with the current version of ArcGIS and is no longer supported.

I really hate R. Is there any other way to use T-LoCoH?

Unfortunately no, but we are working on a web GUI which will make it a little easier.

I'm having problems installing rgeos and/or rgdal on my Mac.

See Installing rgeos and rgdal on a Mac.


Parameter Selection

What method should I use (k/a/r)?

The k-method is fine if all you're interested in is density isopleths (utilization distributions). For all other hull metrics (including time-use metrics), you're better off using the a-method. If in doubt, use the a-method.
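
A minimal sketch of the a-method (the s and a values below are purely illustrative, not recommendations; choose them for your own data):

## Sketch only: s and a values are illustrative
toni.lxy <- lxy.nn.add(toni.lxy, s=0.03, a=6000)
toni.lhs <- lxy.lhs(toni.lxy, s=0.03, a=6000)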

How do I find an upper and lower bound for a?

Try using the lxy.amin.add function with nnn=c(3,20). More to come later.
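
For example (a sketch; the s value is illustrative and other arguments of lxy.amin.add may be needed, see its help page):

## Sketch: find a values such that (nearly) every point has at least 3 and at least 20 neighbors
toni.lxy <- lxy.amin.add(toni.lxy, s=0.03, nnn=c(3, 20))
summary(toni.lxy)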


Data

What coordinate system should my data be in?

T-LoCoH can work with any coordinate system, including geographic coordinates (i.e., latitude-longitude). However, it is recommended that the locations be in a coordinate system with real-world units (e.g., meters), because hulls are sorted by area to form isopleths, and the area of a polygon expressed in degrees squared is neither very meaningful nor consistent. Area is also important for interpreting the size of isopleths. UTM is a popular coordinate system that works well for small to medium sized study sites. Appendix III of the T-LoCoH tutorial, Importing GPS Data into R, gives an example of projecting latitude-longitude data with R. T-LoCoH also has a function, lxy.reproject, which can reproject a LoCoH-xy object.
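
As a rough sketch of the projection step using the sp and rgdal packages (the gps data frame, its column names, the timestamp field, and the UTM zone below are assumptions for illustration):

library(sp)
library(rgdal)
## Hedged sketch: project lat-long coordinates to UTM before importing with xyt.lxy()
latlong.sp <- SpatialPoints(gps[, c("lon", "lat")],
                            proj4string=CRS("+proj=longlat +datum=WGS84"))
utm.crs <- CRS("+proj=utm +zone=36 +south +datum=WGS84")
utm.sp <- spTransform(latlong.sp, utm.crs)
toni.lxy <- xyt.lxy(xy=coordinates(utm.sp), dt=gps$timestamp, id="toni",
                    proj4string=utm.crs)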

What's the difference between the 'id' and 'ptid' fields?

T-LoCoH's data structure for location data has an 'id' and 'ptid' value for each location. The 'id' is the name of an individual (animal) or GPS device. This field allows you to work with locations for several individuals in the same LoCoH-xy object, so you can analyze them or create hulls for them simultaneously. Another use of the 'id' field is to subdivide the dataset for a single individual into discrete groups, for example by season (e.g., flyn_dryseason, flyn_wetseason).

The 'ptid' (point id), on the other hand, is simply a unique numeric value for each location (i.e., a primary key). It exists to help you link points and hulls to other variables associated with each location. For example, if you have a spreadsheet with the soil type at each location, and a column in that spreadsheet with a unique id for each location, you can pass the vector of unique point id values to xyt.lxy(). Any hulls created will then carry those ptid values, so you can analyze the hull metrics together with the soil type.
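
As a sketch, suppose you have a data frame soil.df with columns ptid, timestamp, and soil (all hypothetical names), and that the hull attribute table carries the ptid column through:

## Hedged sketch: soil.df, xy.mat, utm.crs, and toni.lhs are hypothetical objects
toni.lxy <- xyt.lxy(xy=xy.mat, dt=soil.df$timestamp, id="toni", ptid=soil.df$ptid,
                    proj4string=utm.crs)
## ... add nearest neighbors and create hulls ...
hulls.spdf <- hulls(toni.lhs)[[1]]
hulls.soil <- merge(hulls.spdf@data, soil.df[, c("ptid", "soil")], by="ptid")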

What is the data structure of a locoh-hullset object?

See the vignette (pdf file) on T-LoCoH data classes by typing:

browseVignettes("tlocoh")


Analysis

I have created isopleths for different individuals (or different seasons), and now want to compare the area of overlap. How can I do this?

See the tip sheet Measuring area of intersection for isopleths.

My GPS unit is programmed to go to sleep at night. How will this affect my analysis?

GPS devices are often programmed to power down when an animal is known to be inactive, such as at night, to save battery power. The omission of nighttime locations is not a problem for modeling space use with T-LoCoH (or any other method) per se, but it may affect how you interpret your space use model, or compel you to make additional assumptions under which your space use model is valid.

The omission of nighttime locations is a specific case of the more general problem of analyzing time series data with gaps. The first thing to note is that when there is no sampling at night, the estimated utilization distribution (isopleths) should be interpreted as the UD of the sampled data (i.e., daylight hours), and not necessarily the UD for the individual as a whole, even if the animal doesn't move at night. To illustrate why this caveat is necessary, take for example a dataset where the sampling is once an hour during the day, but the GPS unit goes to sleep from 6pm to 6am because the birds usually stay in the nest during this period. Now consider a nest site where the individual was recorded at 6pm, and again at 6am, because it never moved during the night. If the goal is to produce a utilization distribution that represents the overall relative intensity of habitat use, then that nest should really be represented by 13 locations, not 2, to balance out the sampling during the day.

So we couldn't claim that a space-use model generated from daytime points only represents a 24-hour model of space use. But we can make the more narrow claim that the isopleths generated from our data represent how space is used during the day, and that, based on our knowledge of the species, the 24-hour model of space use will be encompassed within the daytime space use. The implications of that more conservative claim for the analysis depend on the research question. If our research question, for example, only demands an outline of the core area (50% isopleth) or the 'homerange' (95% isopleth), and the gradient of use within those contours doesn't really matter, then couching the interpretation of the space-use model is probably not that big of a deal. However, if we're going to feed our UD into another level of analysis where the gradient of the intensity of use matters, for example parameter estimation for a resource selection function or volume intersection with another UD, then the omission of nighttime points might be more of a deal breaker.

Secondly, if you generate time-use metrics (e.g., nsv and mnlv), you certainly wouldn't want the intervisit gap period to be shorter than the period the device was programmed to go to sleep, otherwise it would appear as though a stationary individual (in the extreme case, dead) was making multiple visits to the nest, when in fact it hadn't moved at all.

If the omission of nighttime points is a deal-breaker for your analysis, you could potentially model the missing data (i.e., insert fake nighttime points) based upon the assumption that the individual didn't move during the night. T-LoCoH doesn't have a function to do this, but it wouldn't be hard to make one. You could also test whether the insertion of fake points is reasonable by computing the distance between the last location of each day and the first location of the following day, and checking whether those distances are indeed close to zero.
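
A rough sketch of that check in base R, assuming a data frame gps with projected coordinates x and y and POSIXct timestamps dt (all hypothetical names):

## Distance between each day's last fix and the following day's first fix
gps <- gps[order(gps$dt), ]
gps$day <- as.Date(gps$dt)
last.fix  <- gps[!duplicated(gps$day, fromLast=TRUE), ]  # last fix of each day
first.fix <- gps[!duplicated(gps$day), ]                 # first fix of each day
n <- nrow(last.fix)
overnight.dist <- sqrt((first.fix$x[-1] - last.fix$x[-n])^2 +
                       (first.fix$y[-1] - last.fix$y[-n])^2)
summary(overnight.dist)  # distances near zero support the 'didn't move at night' assumption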


Import / Export

Can I export my results to a format I can import into a GIS?

Yes, see the functions lhs.exp.shp and lxy.exp.shp.
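
For example (a sketch; the iso and hulls arguments are assumptions, so check the help pages):

## Hedged sketch: export isopleths and hulls from a hullset to shapefiles
lhs.exp.shp(toni.lhs, iso=TRUE, hulls=TRUE)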

How do I "get" the hulls and isopleths, and what format are they in?

T-LoCoH stores hulls and isopleths as SpatialPolygonsDataFrame objects, which is a common class for spatial data in R. You can use the hulls() and isopleths() functions to extract these objects from a LoCoH-hullset object. You can also reference them directly from a LoCoH-hullset object (hullsets are S3 lists; see the vignette on T-LoCoH data classes for complete details on the data structure).

Given a hullset object called toni which contains one set of hulls, we can extract those hulls with:


hulls_list <- hulls(toni)
class(hulls_list[[1]]) ## returns 'SpatialPolygonsDataFrame'

## View the polygon attribute table (one row per hull)
head(hulls_list[[1]]@data)
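
Isopleths can be extracted the same way (assuming isopleths have been added to the hullset with lhs.iso.add):

isos_list <- isopleths(toni)
class(isos_list[[1]]) ## also a SpatialPolygonsDataFrame
head(isos_list[[1]]@data)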


Animations

I want to do some additional editing to the individual frames of my animation before encoding them to video. How can I do this?

When running lxy.exp.mov, set tmp.dir to a directory where you want the frames saved. Then set tmp.files.delete=FALSE (so the frames are not deleted), create.mov=FALSE (so it doesn't actually encode the video, only creates the frames), and show.cmd=TRUE (so it will display the ffmpeg command that would have been used to stitch the frames into an animation). After the frames are created, edit them however you like (e.g., with an image editor), then open a command window / terminal and run the ffmpeg command displayed in the output.
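
For example (a sketch; the s value and directory path are illustrative):

## Save the frames only; don't encode, but print the ffmpeg command for later use
lxy.exp.mov(toni.lxy, s=0.003, tmp.dir="~/anim_frames",
            tmp.files.delete=FALSE, create.mov=FALSE, show.cmd=TRUE)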

How do I put my animation on YouTube?

Run lxy.exp.mov with fmt="mp4" and then upload the result to YouTube or another video hosting site.



Questions? Suggestions? Please email the package author.
