How to replicate ArcGIS Intersect in PostGIS


I'm trying to replicate this ArcGIS process in PostGIS: http://blogs.esri.com/esri/arcgis/2012/11/13/spaghetti_and_meatballs/. It describes how to break buffered points down into polygons based on their intersections, count the number of overlapping layers, and attribute that count to the polygons in order to classify them. I'm using it to create a rough point density map with vectors, and the results were surprisingly nice for my data set in ArcGIS. However, I am struggling to come up with something workable in PostGIS, which is where I need it in order to produce dynamic point density layers for a web map.

In ArcGIS, I simply ran the Intersect tool on my buffered points layer to create the shapes I needed.

In PostGIS, I ran this query:

CREATE TABLE buffer_table AS
SELECT a.gid AS gid, ST_Buffer(a.geo, .003) AS geo
FROM public.pointTable a;

CREATE TABLE intersections AS
SELECT a.gid AS gid_a, b.gid AS gid_b, ST_Intersection(a.geo, b.geo) AS geo
FROM buffer_table a, buffer_table b
WHERE ST_Intersects(a.geo, b.geo) AND a.gid < b.gid;

DELETE FROM intersections WHERE gid_a = gid_b;

The output looks pretty much identical to the ArcGIS output, except that it is not breaking the polygons down to the same extent that is required for a meaningful density map. Here are screenshots of what I mean:

ArcGIS is on the left, and PostGIS is on the right. It is slightly difficult to tell, but the ArcGIS image shows the 'interior' polygon created where all 3 buffers intersect. The PostGIS output, on the other hand, does not create that interior polygon; instead it keeps the component circles intact. This makes it impossible to classify just that interior area, which has 3 layers stacked on top of each other, differently from the outer parts that have only 1.

Does anyone know of any PostGIS function to break the polygon down to the extent I need? Alternatively, does anyone know of a better way to produce a point density map with vectors in PostGIS?


You can do this all in one step by chaining the CTEs together, but I did it in several so I could look at the results in QGIS as I progressed.

First, generate a bunch of random points to work with, using a gaussian distribution so we get more overlap in the middle.

create table pts as
with rands as (
  -- two uniform random numbers per point, for a Box-Muller transform
  select generate_series as id, random() as u1, random() as u2
  from generate_series(1, 100)
)
select id,
       st_setsrid(st_makepoint(
         50 * sqrt(-2 * ln(u1)) * cos(2 * pi() * u2),
         50 * sqrt(-2 * ln(u1)) * sin(2 * pi() * u2)), 4326) as geom
from rands;

Now buffer the points into circles so we get some overlap.

create table circles as select id, st_buffer(geom, 10) as geom from pts;

Now, extract just the boundaries from the circles. If you have polygons with holes, you'll have to use ST_DumpRings() and get more fancy here. I have simple polygons so I cheat. Once you have the boundaries, union them against themselves (actually any small piece of coincident linework will do) to force them to be noded and deduplicated. (This is magic.)

create table boundaries as select st_union(st_exteriorring(geom)) as geom from circles;

Now rebuild the areas using the noded linework. This gives the broken-down areas, with only one polygon per area. After polygonizing, dump the individual polygons out of the multipolygon output.

create sequence polyseq;

create table polys as
select nextval('polyseq') as id,
       (st_dump(st_polygonize(geom))).geom as geom
from boundaries;

Now, add a place for the polygon count and fill it up by joining the centroids of the small cut-up polygons to the original circles, and summarizing for each small piece. For larger data sets an index on the circles table at least will be required to make things not impossibly slow.

create index circles_gix on circles using gist (geom);

alter table polys add column count integer default 0;

update polys set count = p.count
from (
  select count(*) as count, p.id as id
  from polys p
  join circles c on st_contains(c.geom, st_pointonsurface(p.geom))
  group by p.id
) as p
where p.id = polys.id;

That's it, you now have no overlapping polygons, but each resultant polygon has a count on it that says how many overlaps it is standing in for.
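As mentioned at the top, the whole workflow can also be chained into one statement with CTEs. A minimal sketch, assuming the pts(id, geom) table from the first step; the CTE names and the output table name overlap_counts are only illustrative:

CREATE TABLE overlap_counts AS
WITH buffered AS (
  -- buffer the points into circles
  SELECT id, ST_Buffer(geom, 10) AS geom FROM pts
),
rings AS (
  -- union the exterior rings against themselves to node and deduplicate the linework
  SELECT ST_Union(ST_ExteriorRing(geom)) AS geom FROM buffered
),
pieces AS (
  -- rebuild areas from the noded linework and dump out the individual polygons
  SELECT (ST_Dump(ST_Polygonize(geom))).geom AS geom FROM rings
),
cells AS (
  SELECT row_number() OVER () AS id, geom FROM pieces
)
SELECT c.id, c.geom, count(*) AS count
FROM cells c
JOIN buffered b ON ST_Contains(b.geom, ST_PointOnSurface(c.geom))
GROUP BY c.id, c.geom;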


The method I ended up using was to create a fishnet grid in my area of interest with a high enough "resolution" to style and reflect the data to a reasonable degree. You can read about the fishnet function here: How to create a regular polygon grid in PostGIS?
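For reference, the fishnet function commonly cited in that question looks roughly like the following (adapted from the ST_Translate example in the PostGIS documentation; verify against the linked answer, since versions differ slightly):

CREATE OR REPLACE FUNCTION ST_CreateFishnet(
    nrow integer, ncol integer,
    xsize float8, ysize float8,
    x0 float8 DEFAULT 0, y0 float8 DEFAULT 0,
    OUT "row" integer, OUT col integer,
    OUT geom geometry)
RETURNS SETOF record AS
$$
-- Build one cell polygon of size xsize by ysize at the origin, then translate
-- a copy of it into every row/column position, offset by (x0, y0).
SELECT i + 1 AS "row", j + 1 AS col,
       ST_Translate(cell, j * $3 + $5, i * $4 + $6) AS geom
FROM generate_series(0, $1 - 1) AS i,
     generate_series(0, $2 - 1) AS j,
     (SELECT ('POLYGON((0 0, 0 ' || $4 || ', ' || $3 || ' ' || $4 || ', ' || $3 || ' 0, 0 0))')::geometry AS cell) AS foo;
$$ LANGUAGE sql IMMUTABLE STRICT;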

CREATE TABLE fishnet AS SELECT * FROM ST_CreateFishnet(800,850,.0005,.0005,-104.9190,38.7588);

This creates a fishnet with 800 rows and 850 columns of cells that are 0.0005 degrees in height and width (the data are in WGS84 lat/long, and the geographic extent is small enough that the distortion is negligible, i.e. the cells are all distorted more or less equally); the last two arguments are the coordinates of the bottom-left corner of the grid.

UPDATE fishnet SET geom = ST_SetSRID(geom, 4326);

CREATE INDEX fishnet_geom ON fishnet USING gist (geom);

ANALYZE fishnet;

Because this creates a huge number of polygons that will have queries run against them, I created an index and updated the statistics. This reduced my typical queries from 50+ seconds to 4-5 seconds.

SELECT ST_Union(a.geom), a.count
FROM (
  SELECT count(*) AS count, fishnet.geom AS geom
  FROM fishnet, incidents
  WHERE ST_DWithin(incidents.geo, fishnet.geom, .002)
    AND incidents.incidenttype = 'Burglary'
  GROUP BY fishnet.geom
) a
WHERE a.count >= 3
GROUP BY a.count;

The subquery here counts the number of incidents within 0.002 degrees (approx. 220 meters) of each fishnet grid polygon, and groups them by the fishnet grid cell. This effectively counts the number of overlapping circles to the resolution of the grid.

The outer query unions the grid cells by their count value and restricts the count to 3 or greater. The union isn't strictly necessary, and it is the most resource-intensive part of the query, but it is critical for web mapping: it turns tens of thousands of grid polygons, which don't serve well directly to OpenLayers, into one multipolygon per distinct count value (usually a few dozen for my data).

Restricting the count value is important for heat maps so they don't depict so much data that they become impossible to interpret; it also has the added benefit of speeding up the query significantly.

Final result:


Check Geometry

  • Microsoft SQL Server Geometry and Geography
  • PostgreSQL PostGIS Geometry and Geography
  • Oracle SDO Geometry

The Output Table will have a record for each geometry problem discovered. If no problems are found, the table will be empty.

The Output Table contents, including the PROBLEM codes, are written in English.

For point features, only the null geometry problem is possible.

To facilitate the review of the features that are reported to have geometry problems, you can join the Input Features to the Output Table using the Add Join tool, along with the input's OBJECTID or FID field and the output table's FEATURE_ID field.
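In relational terms the join is conceptually just the following (a sketch only, with placeholder table names, not the actual Add Join invocation):

-- Illustration only: pair each reported problem with its source feature.
-- input_features and check_geometry_output are placeholder names.
SELECT f.*, t.PROBLEM
FROM input_features f
JOIN check_geometry_output t ON t.FEATURE_ID = f.OBJECTID;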

The Esri validation method ensures that geometry is topologically legal using the Esri Simplify method. Only the Esri validation is available for data stored in an enterprise geodatabase.

The Open Geospatial Consortium (OGC) validation method ensures that geometry complies with the OGC specification as defined in OpenGIS Implementation Standard for Geographic information – simple feature access – Part 1: common architecture.

After a feature's geometry is repaired using the OGC option, any subsequent edit or modification may cause the geometry to no longer comply with the OGC specification. After feature modification, run the Check Geometry tool to check for new geometry issues. If necessary, rerun the Repair Geometry tool.

The OGC simplify method does not support nonlinear segments such as Bézier curves, circular arcs, and elliptic arcs. These types of segments will have to be densified using the Densify tool on the input dataset before running Check Geometry. To avoid irreversibly changing nonlinear segments when running the Densify tool, make a copy of the data first. To determine whether your data has nonlinear segments, use the Add Geometry Attributes tool.

The problems identified by this tool can be addressed in the following ways:

  • Manually edit and fix the feature with the geometry problems. Some of the problems cannot be fixed through editing.
  • Use the Repair Geometry tool. Some problems associated with data stored in an enterprise database may not be repairable with ArcGIS tools.

The Output Table has the following fields:

  • CLASS —The full path to and name of the feature class in which the problem was found.
  • FEATURE_ID —The Feature ID (FID) or Object ID (OID) for the feature with the geometry problem.
  • PROBLEM —A short description of the problem.

The PROBLEM field will contain one of the following codes:

  • Short segment —Some segments are shorter than allowed by the system units of the spatial reference associated with the geometry.
  • Null geometry —The feature has no geometry or nothing in the SHAPE field.
  • Incorrect ring ordering —The polygon is topologically simple, but its rings may not be oriented correctly (outer rings clockwise, inner rings counterclockwise).
  • Incorrect segment orientation —Individual segments are not consistently oriented. The to point of segment i should be incident on the from point of segment i+1.
  • Self intersections —A polygon must not intersect itself.
  • Unclosed rings —The last segment in a ring must have its to point incident on the from point of the first segment.
  • Empty parts —The geometry has multiple parts, and one of them is empty (has no geometry).
  • Duplicate vertex —The geometry has two or more sequential vertices with identical coordinates.
  • Mismatched attributes —The z- or m-coordinate of a line segment's endpoint does not match the z- or m-coordinate of the coincident endpoint on the next segment.
  • Discontinuous parts —One of the geometry's parts is composed of disconnected or discontinuous parts.
  • Empty Z values —The geometry has one or more vertices with an empty z-value (NaN, for example).
  • Bad envelope —The envelope does not match the coordinate extent of the geometry.
  • Bad dataset extent —The extent property of the dataset does not contain all of the features in the dataset. For this problem, the FEATURE_ID will be -1.
  • NEEDS_REORDERING —The shape is OK except that it needs to be reordered and/or have duplicate points removed.
  • SE_INVALID_ENTITY_TYPE —Invalid entity type.
  • SE_SHAPE_INTEGRITY_ERROR —Shape integrity error.
  • SE_INVALID_SHAPE_OBJECT —The given shape object handle is invalid.
  • SE_COORD_OUT_OF_BOUNDS —The specified coordinate exceeds the valid coordinate range.
  • SE_POLY_SHELLS_OVERLAP —Two donuts or two outer shells overlap.
  • SE_TOO_FEW_POINTS —The number of points is less than required for the feature.
  • SE_INVALID_PART_SEPARATOR —Part separator in the wrong position.
  • SE_INVALID_POLYGON_CLOSURE —Polygon does not close properly.
  • SE_INVALID_OUTER_SHELL —A polygon outer shell does not completely enclose all donuts for the part.
  • SE_ZERO_AREA_POLYGON —Polygon shell has no area.
  • SE_POLYGON_HAS_VERTICAL_LINE —Polygon shell contains a vertical line.
  • SE_OUTER_SHELLS_OVERLAP —Multipart area has overlapping parts.
  • SE_SELF_INTERSECTING —Linestring or poly boundary is self-intersecting.


I am trying to delete the slivers that appeared when I merged some polygons.

I have two different cases:

For CASE 1 there is no problem: I fill the gaps with a function that fills all gaps smaller than x square meters.

But for CASE 2, where the small gaps are "open", I cannot manage to fill them.

I have tried something like buffer(1) followed by buffer(-1), but of course I get rounded corners. Does anyone have a proper solution?

You can avoid getting rounded corners by adding the parameter 'join=mitre' to ST_Buffer:

So it works perfectly fine with ST_Buffer using 'join=mitre', as sketched below:
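A minimal sketch, assuming the merged polygons sit in a hypothetical table merged_polygons and using an illustrative closing distance of 1 map unit:

SELECT ST_Buffer(
         ST_Buffer(geom, 1, 'join=mitre'),   -- push the boundary outward with sharp corners
         -1, 'join=mitre') AS geom           -- then pull it back in, leaving the small gaps filled
FROM merged_polygons;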

If the buffer is too large, st_buffer(geom, 10, 'join=mitre') produces strange results from time to time (the polygons become spiky). To avoid this effect it is safer to preserve the old boundary with ST_Intersection:


Using Google BigQuery GIS table as a data source to QGIS


Google BigQuery recently added GIS functionality (still in beta), see here.

I have a dataset in BigQuery that I would like to use as a data source in QGIS - much the same way as I would do with a normal PostGIS database or a CSV file.

Is there a way to do this?

I guess I could export the data from bigquery into a PostGIS instance in GCS, and then I would be able to load it into QGIS. But that doesn't feel like the right way to do it.

Size of data is not a concern, and I'm using QGIS 3.4.2 (Madeira).

What do you think about exporting as GeoJSON and then importing in QGIS?

Thanks for the tip, that's one possibility, but it has the problem that if I update my data I will have to export and import it again. Ideally, I would like to avoid this, the same way as I would if I were using a PostGIS db.

You are welcome, sorry for a poor suggestion. I am not an expert in this domain, so probably let's wait for a better hint/answer. Maybe you would have to develop an add-on that updates the data in PostGIS after the GeoJSON is automatically created, e.g. Import GeoJSON into PostGIS

What I can think of is creating a connection with R using the libraries bigqueryR and RQGIS3, everything in a single script; otherwise you would need to program a plugin in QGIS. @JoãoCoelho

Thanks for the suggestion! I'll keep that in mind if I can't find a more straightforward solution. I'll also look into what it takes to create a plugin for QGIS - if you have any thoughts or tips on that feel free to share them :)
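For what it's worth, a rough sketch of the GeoJSON route suggested above, using BigQuery's ST_ASGEOJSON function (the dataset, table, and column names are placeholders, and the full export workflow should be checked against the BigQuery GIS documentation):

-- BigQuery Standard SQL sketch: emit each geometry as a GeoJSON string
-- alongside the other attributes; the rows can then be assembled into a
-- FeatureCollection and loaded into QGIS.
SELECT
  ST_ASGEOJSON(geom) AS geometry_json,
  * EXCEPT (geom)
FROM `mydataset.mytable`;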


How to actually learn or practice programming?

I understand that learning how to program is technically one of the requirements in GIS. I realize that there are many free courses online, but how would you actually study for it? Do you just watch the videos and try to learn? What would you recommend to beginners? I am trying to learn Python and SQL in particular.

Any advice would be appreciated!

Edits: Thanks all for your comments!!

There's a ton you can do with Python without ever accessing the arcpy module. Start by creating a script that automatically dumps the contents of a mounted SD card to a hard drive and runs every time one is mounted. This can be accomplished a couple of different ways, but if you can get good with Task Scheduler and Python, it's a good place to start. Once you're comfortable with that, work on iterating through folders to recreate complex folder structures and so on. Google a lot.

SQL might be a little more difficult to get into if you don't have access to SSMS and SQL Server Express, or MySQL, or something similar, because it has a little more up-front setup. However, you can use MS Access to learn a ton about SQL. Make a simple database in Access and use the query builder tools to write to it and query from it. Once you've got a couple of those written, look at the SQL view of your queries, try to unpack what you're looking at, and re-write them without using the builder, keeping the generated queries as a cheat sheet when you need it. I taught myself both of these on the job this way, and while I'm sure there's a more direct way to accomplish the same thing, learning how to teach yourself is extremely rewarding. Message me if you want guidance. I went through the same thing you're going through 3 years ago. Getting a job will help a lot.
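For instance, the kind of statement a query builder spits out, and that you can then try to rewrite by hand, looks something like this (the table and column names are invented for illustration):

-- Invented example: the sort of query a builder generates.
SELECT ParcelID, OwnerName, Acreage
FROM Parcels
WHERE City = 'Denver' AND Acreage > 5
ORDER BY OwnerName;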


Where are they Now?

  • Graeme, Phil, Jeff and Chris are still doing geospatial consulting at Refractions Research.
  • Dave maintained and improved PostGIS for the first couple years. He left Refractions for other work, but still works in open source geospatial from time to time, mostly in the world of GeoServer and other Java projects.
  • I found participating in the growth of PostGIS very exciting, and much of my consulting work… less exciting. In 2008, I left Refractions and learned enough C to join the PostGIS development community as a contributor, which I’ve been doing ever since, currently as an Executive Geospatial Engineer at Crunchy Data.


Getting shapefile of river from OpenStreetMap?

I do not want to download the whole map of an area. I just want the rivers and other waterbodies present in an area as shapefiles so that I can use them in QGIS. I tried Geofabrik, but it gives the whole map, and for the full country at that, while I need data for a city. Right now I am trying JOSM; I will update you when it works.

1) Search for the feature you're after. I chose "south platte river", which runs through Denver. This gives the fields and tags that are used by OSM to store the data:

2) Identify the tags and values of the features you're after by


  1. Zooming all the way in to the map
  2. Clicking on the layers icon on the right (the three sheets of paper)
  3. Clicking on the last menu entry (Map data, or something similar in your language)
  4. Waiting for the features on the map to turn blue (make sure you're zoomed in far enough to see them)
  5. Clicking on the feature you're after

Then go on over to the Overpass Turbo page, then click Wizard

Using this information, the name value is South Platte River and the waterway value is river, so you can build a query like this: name="South Platte River" and waterway=river

Then click "build and run query"

The query will run and the result will show on the map:

Next click the "Export" option:

I like the GeoJSON option.

Open the file in QGIS, and away you go!

You can do a 'save as' to save it as a new type of vector layer.

In your case, you could also use the waterway=river query to get all the rivers in the area you're after, and you can draw a manual selection box to narrow down the geography.

QGIS - How to merge many individual KML layers into one?

I'd like to open a USGS KML file of bedrock geology for an entire county, but there are over 2,000 individual polygon layers. Does anyone know if there is a way to merge all of these layers together into one file before opening in QGIS?

Coordinate system - Should point data be equidistant projected when using ArcGIS IDW spatial interpolation?

I am wondering if anyone can clarify whether point data should be projected to an equidistant projection when using ArcGIS IDW spatial interpolation?

I am working on a dataset from Western North America spanning about 30 degrees of latitude. The data are currently in Lat/Long (NAD83).

Does ArcGIS "project on the fly" or somehow adjust for latitude when calculating distances to my sample points during the interpolation procedure, or should I be supplying everything in a projection that preserves distances?

IDW works by finding the data points located nearest each point of interpolation, weighting the data values according to a given power p of the distances to those points, and forming the weighted average. (Often p = -2.)

Suppose there is some amount of distance distortion around an interpolation point that is the same in all directions. This will multiply all distances by some constant value x. The weights therefore all get multiplied by x^p. Because this does not change the relative weights, the weighted average is the same as before.
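In symbols (restating the argument above, with data values $z_i$, true distances $d_i$, and weight exponent $p$):

$$\hat z \;=\; \frac{\sum_i d_i^{\,p}\, z_i}{\sum_i d_i^{\,p}} \;\longrightarrow\; \frac{\sum_i (x d_i)^{p}\, z_i}{\sum_i (x d_i)^{p}} \;=\; \frac{x^{p}\sum_i d_i^{\,p}\, z_i}{x^{p}\sum_i d_i^{\,p}} \;=\; \hat z .$$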

When the distance distortion changes with direction, this invariance no longer holds: data points in some directions now appear (on the map) relatively closer than they should while other points appear relatively further. This changes the weights and therefore affects the IDW predictions.

Consequently, for IDW interpolation we would want to use a projection that creates roughly equal distortions in all directions from each point on the map. Such a projection is known as conformal. Conformal projections include those based on the Mercator (including Transverse Mercator (TM)), Lambert Conic, and even Stereographic.

It is important to realize that conformality is a "local" property. This means that the distance distortion is constant across all bearings only within small neighborhoods of each point. For larger neighborhoods involving greater distances, all bets are off (in general). A common--and extreme--example is the Mercator projection, which is conformal everywhere (except at the poles, where it is not defined). Its distance distortion becomes infinite at sufficiently large north-south distances from the Equator, while along the Equator itself it's perfectly accurate.
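For the normal Mercator, for instance, the local scale factor at latitude $\phi$ is $k(\phi) = \sec\phi$: it equals 1 on the Equator and grows without bound as $\phi$ approaches the poles.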

The amount of distortion in some projections can change so rapidly from point to point that even conformality will not save us when the nearest neighbors are far from each other or near the extremes of the projection's domain. It is wise, then, to choose a conformal projection adapted to the study region: this means the study region is included within an area where its distortion is the smallest. Examples include the Mercator near the Equator, TM along north-south lines, and Stereographic near either pole. In the conterminous US, the Lambert Conformal Conic is often a good default choice when the reference latitudes are placed within the study region but near its northern and southern extremes.

These considerations usually are important only for study regions that extend across large countries or more. Within small countries or states of the US, popular conventional coordinate systems exist (such as various national grids and State Plane coordinates) which introduce little distance distortion within those particular countries or states. They are good default choices for most analytical work.

