POST
Sure: www.pcpartpicker.com. If, for example, you are still looking for a "mini" tower, I'd start by selecting that and then picking the other parts to make sure everything still fits inside the smaller box... PCPartPicker will let you know if everything is compatible. Note that any processor with a "K" at the end, e.g. the i7-6700K, is capable of being "overclocked" and needs a separate cooler (non-K processors come with their own little stock cooler). You probably don't need a "K" processor, and you may have a tough time fitting an aftermarket cooler into a mini tower if you go that route. But hey, go nuts if it suits you. For graphics cards, Xander Baker's reply above is good info. I think I went with the GTX 1060 6GB a year ago, which is more than enough for ArcGIS Pro, so don't let yourself get overwhelmed by graphics cards; there are zillions out there and you can always upgrade later. Good luck.
Posted 11-08-2017, 02:51 PM
POST
Hi Tanya, I was in this exact same position a year ago: I needed to upgrade from a laptop to a desktop so that I could crunch some data in ArcGIS for my dissertation. I knew that ArcGIS Pro was coming out (or had just come out) and that newcomers to ArcGIS were being advised to start with Pro. I ended up more or less learning both Desktop and Pro (at the end of the day, very similar capabilities); I occasionally need to do something in Desktop, but 99% of what I need gets done in Pro (or QGIS, open-source software). And Pro is a lot more user-friendly to a newbie who has grown up working with the Microsoft Office suite. Why does that matter? Pro suggests/requires higher computer specs and especially, in your case, a better graphics card. Because Pro aims to provide a more or less seamless transition between 2D and 3D rendering, ESRI wants you to have a more powerful graphics card. The graphics card in the VivoMini is not good, and it is "integrated," meaning it is soldered to the motherboard and you cannot upgrade it. HOWEVER, this might not be an issue for you if you do not plan to do 3D rendering: if you just stick to 2D mapping, the integrated graphics in the VivoMini should be just fine. Everything else in the VivoMini looks good (i7 processor, 16GB RAM (Pro lists a minimum of 8GB and, I believe, recommends 16GB), 500GB SSD, etc.). That size of SSD should also be fine UNLESS you plan to play with a lot of imagery. For example, I went nuts and downloaded 20 years' worth of Landsat imagery for my area of interest... halfway through the download process I realized I needed more storage, so I ran out and bought a 1TB HDD just for storage: slower and cheaper than an SSD, but fine for archiving.
Everyone has their plug and this is mine: I would suggest, for reasons of upgradeability (is that a word?), that you assemble your own computer. It's really pretty simple; I have no computer background and I was able to do it pretty easily. You can get a top-of-the-line desktop that meets your exact requirements and leaves room for upgrades if you suddenly realize "man, I really wish I could render 3D with a better graphics card," or "wow, that Landsat imagery really takes up a lot of space." I can post a link to a website where you punch in your requirements and it checks all the parts for compatibility and finds the best price on each part, if you're interested. If you're not even the slightest bit mechanically inclined, if you've never seen a screwdriver, I'd ask your friends/colleagues: chances are good someone has assembled their own computer and will help you. The other great thing about assembling your own system is that you avoid all the pre-installed bloatware that is ubiquitous these days. And you can put it in any case you like; for example, you could build your own "mini." And it's cheaper. I'd also avoid buying a "gaming" system (I've done that before too, for the same reasons you mention): you pay more for stuff you don't need. Bottom line: i7 processor, 16GB RAM, SSD (size depends on you), graphics card (if you plan to render 3D).
Posted 11-02-2017, 12:03 PM
POST
Thank you very much Dan and Adrian. And thanks for situating this in Analysis. Questions/ideas are numbered at the bottom again. I wanted to provide some updates to let you know I appreciated your assistance, and in case anyone else stumbles across this thread. I added zeros, since they are valid observations, and this certainly helped produce useful choropleth maps for me. My explanatory variables are cleaned, look pretty, and I've been able to start some Exploratory Regression and OLS in ArcGIS Pro (my preferred platform, although I am comfortable with Desktop). However, I've come to realize why adding zeros creates problems. And, nothing against the great ESRI team creating these example videos, but it seems a bit misleading to provide videos of crime-incident statistical analysis where the zero counts have all been dropped, without really saying why; presumably they were dropped to produce statistically significant results (but not to reflect the real world). Zeros CAN be dropped (I've come to find out), but only if those zeros are normally distributed. Mine are not, and presumably most zeros people would like to drop are not normally distributed. I have come to find that the problem with zeros is that OLS/GWR cannot handle a bounded, non-continuous dependent variable, which is exactly what count data often are. And there is no way to log- or exponential-transform data with a lot of zeros into a normal distribution. This was a huge surprise to me when I figured out what was going on. To me, the most interesting questions/data depend on counts, and counts tend to have a lot of zeros (e.g. "Q: How many times last month did you rob a liquor store?" "A: Um, zero..."). Any suggestions? In no particular order, here are some options I am considering:

1. Interpolate crime to those zero-count areas. Seems pretty shady to me; I'm not sure it passes the sniff test, or whether it would even help, because a majority of low-value counts is effectively the same thing as a majority of zeros in this case.
2. Drop the zeros and run the spatial statistics (OLS/GWR) only on the communities that have crime counts. The problem is that all those nice, peaceful neighborhoods do not get to contribute their data to the outcome.
3. Ditch spatial statistics altogether, move over to SPSS, and run Poisson/zero-inflated regression models, which do not assume a normal curve. This is the option I am leaning towards; however, it pains me not to be able to take neighbors into account (and it makes me second-guess all those statistical findings running around in the world that cannot account for geography...).
4. Magic.

Adrian, to answer your question, this is part of my dissertation. I was hoping it would be a relatively small component; I wanted to round out the statistical portion with interviews/other qualitative data, but this portion is hanging me up. I will definitely consider publishing it to an ESRI storyboard/blog, etc. Thanks a lot for those suggestions.
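For anyone weighing option 3: as a hedged illustration of what a zero-inflated Poisson (ZIP) model actually does (this is not the poster's workflow, and it sits outside ArcGIS/SPSS entirely), the simplest intercept-only ZIP, a mixture of structural zeros and an ordinary Poisson count process, can be fit by EM in plain Python. All names and the simulated data here are invented for the sketch:

```python
import math
import random

def fit_zip(counts, iters=200):
    """Fit an intercept-only zero-inflated Poisson (ZIP) by EM.

    Model: with probability pi an observation is a structural zero;
    otherwise it is drawn from Poisson(lam). Returns (pi, lam).
    """
    n = len(counts)
    pi, lam = 0.5, max(sum(counts) / n, 1e-6)   # crude starting values
    for _ in range(iters):
        # E-step: posterior probability that each observed zero is structural
        z = [pi / (pi + (1 - pi) * math.exp(-lam)) if y == 0 else 0.0
             for y in counts]
        # M-step: update the mixing weight and the Poisson rate
        pi = sum(z) / n
        lam = sum(counts) / sum(1 - zi for zi in z)
    return pi, lam

def poisson(lam):
    """Simple Poisson sampler (Knuth's algorithm)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Simulate zero-heavy counts: 60% structural zeros, Poisson(3) otherwise
random.seed(42)
data = [0 if random.random() < 0.6 else poisson(3.0) for _ in range(5000)]

pi_hat, lam_hat = fit_zip(data)
# Estimates should land near the true values pi = 0.6, lam = 3.0
print(round(pi_hat, 2), round(lam_hat, 2))
```

The point of the sketch: the model separates "places where this crime structurally cannot or does not occur" (the pi component) from "places where it occurs at some rate" (the lam component), which is exactly the distinction a Gaussian OLS on zero-heavy counts cannot make.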
Posted 10-17-2017, 12:00 PM
POST
Hello, self-taught GIS analyst here (i.e. relative newbie), but I'm not having any technical problems. Rather, I am having theoretical issues while trying to set up my data for analysis. Questions are near the bottom. I have a time series of perhaps 800 crime incidents aggregated into a communities polygon layer (i.e. a polygon for each of about 700 communities). The communities layer has a number of attributes including type of economy, population, buildings, area, wealth, etc. These are my independent variables; together they form the community "social structure." My dependent variable is "crime incidents." "Community" is a 4th-tier administrative unit in my research area (1st tier is country, 2nd state, 3rd county, 4th community). My research question is: "Which of the social-structure I.V.s (or combination thereof) predicts the occurrence of crime incidents at the community level of analysis?"

Originally, I thought that I needed to do some sort of hot spot analysis. In an early attempt (using a fishnet polygon instead of the community polygons to aggregate crime incidents), however, the hot spots came out the size of the 3rd-tier (county) administrative units, and any fool remotely familiar with my study area could tell you where the crime "hotspots" are at the county level. So that won't work. I think I realize now that this occurred because: "A feature with a high value is interesting but may not be a statistically significant hot spot. To be a statistically significant hot spot, a feature will have a high value and be surrounded by other features with high values as well" (from Esri, "How Hot Spot Analysis Works"). In other words, because the contrast between polygons with and without crime incidents was so great, the tool could only identify "statistically significant" areas roughly the size of the counties.

You see, all 800 or so crime incidents occur in perhaps 75 communities, meaning that the vast majority of communities have no crime-incident data, and those that do have incidents are often surrounded by communities without incidents. I thought that adding a "zero" for crime incidents in these communities might improve the analysis by giving the tool more data to crunch. Now I don't think that is the case, because: "[I]n cases where many of the grid polygons within the study area contain zeros for the number of events, increase the polygon grid size, if appropriate, or remove those zero-count grid polygons prior to analysis."

Question #1: Why would I remove zeros for hot spot analysis? Doesn't that simply reduce my sample size?

Question #2: Do I really want to do a "hot spot analysis"? Since the "crime communities" are rather dispersed, can I not just symbolize each community polygon by crime-incident count (i.e., since almost all communities surrounding a "crime community" have zeros, any crime is a "hotspot")? Which leads to my final question...

Question #3: My research question above... what GIS tool would allow me to determine which community-polygon social-structural attributes have the most effect on crime incidents? For reasons of data limitations, I had to assign slightly imprecisely geolocated crime incidents to one central community when in reality the incidents probably fell across 3 or 4 adjacent communities. Therefore, I'd like to use inferential statistics not only to analyze the influence of the social-structure attributes in the specific "crime community" but also to account for the social-structural impact of communities adjacent to the "crime community" within some spatial threshold. Since the crime incidents are time-stamped and some of the community attributes also vary temporally, ideally I'd also like to incorporate time into the analysis (e.g., if a community experiences a crime at Time 1, does that influence the probability of crime at Time 2?). But I don't want to get ahead of myself just now. Apologies for the length, but any assistance would be GREATLY appreciated. Thanks!
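To make the quoted Esri point concrete (why an isolated high-count polygon does not register as a hot spot), the Getis-Ord Gi* statistic that underlies Hot Spot Analysis can be sketched in plain Python. This is a toy illustration with made-up counts and a one-dimensional neighbor structure, not the ArcGIS implementation (which adds distance bands, multiple-testing corrections, and so on):

```python
import math

def gi_star(values, neighbors):
    """Getis-Ord Gi* z-scores with binary weights, focal unit included.

    values: observed count for each polygon.
    neighbors: dict mapping each polygon index to its adjacent indices.
    """
    n = len(values)
    xbar = sum(values) / n
    s = math.sqrt(sum(v * v for v in values) / n - xbar * xbar)
    zscores = []
    for i in range(n):
        js = [i] + neighbors[i]        # the "star": include polygon i itself
        wsum = len(js)                 # binary weights, so sum(w) = neighbor count
        local = sum(values[j] for j in js)
        num = local - xbar * wsum
        den = s * math.sqrt((n * wsum - wsum * wsum) / (n - 1))
        zscores.append(num / den)
    return zscores

# Toy "communities" on a line: one isolated spike vs. a clustered run of highs
counts = [0, 0, 0, 9, 0, 0, 8, 9, 8, 0]
nbrs = {i: [j for j in (i - 1, i + 1) if 0 <= j < len(counts)]
        for i in range(len(counts))}
z = gi_star(counts, nbrs)
# The clustered highs (indices 6-8) score well above the isolated spike at 3
```

Even though polygon 3 has the same raw count as its clustered counterparts, its Gi* z-score stays near zero because its neighbors are all zeros, which is exactly the behavior the questioner observed when most surrounding communities have no incidents.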
Posted 09-17-2017, 02:58 PM
POST
Any chance you ever figured this out? I'm running into this same problem tonight... Thanks!
Posted 09-02-2017, 08:23 PM