Tool: ArcGIS Pro 2.9.3 Technique: Overlay analysis, visualization via remote sensing
These maps were developed to aid or supplement the Natural Capital Valuation (NatCap) initiative. As stated by WWF:
An essential element of the Natural Capital Project is developing tools that help decision makers protect biodiversity and ecosystem services.
One of the sites included in this initiative by WWF-Malaysia is the Heart of Borneo (HoB). For this exercise specifically, the visualization of policy and land use eventually becomes the data input for the InVEST tool, which generates the models and maps of the economic values of ecosystem services within the landscape of interest.
The data generation mainly involves light-touch remote sensing to assess the status of land use in the respective concessions, using Sentinel-2 satellite imagery with specific band combinations to identify tree cover, particularly mangrove forest.
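For a sense of the band work involved, here is a minimal Python sketch that stacks three Sentinel-2 bands into a false-colour composite for vegetation inspection. The file names and the B8/B4/B3 choice (a common near-infrared composite in which dense vegetation such as mangroves stands out) are assumptions for illustration, not the exact combination used for these maps.

```python
# Minimal sketch: build a Sentinel-2 false-colour composite (NIR, Red, Green)
# to make tree cover such as mangroves stand out. Band file names below are
# hypothetical; point them at your own Sentinel-2 granule.
import numpy as np
import rasterio

bands = ["B08.jp2", "B04.jp2", "B03.jp2"]  # NIR, Red, Green (assumed names)

layers = []
for path in bands:
    with rasterio.open(path) as src:
        layers.append(src.read(1).astype("float32"))
        profile = src.profile  # keep the georeferencing of the band read

composite = np.stack(layers)  # shape: (3, rows, cols)

profile.update(count=3, dtype="float32", driver="GTiff")
with rasterio.open("false_colour_composite.tif", "w", **profile) as dst:
    dst.write(composite)
```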
So, this new year, I've decided to take it down a notch and systematically choose my battlefield. Wildlife species data has always been a mystery to me. As we all know, biologists hold their data close to their hearts, to the point of annoyance sometimes (those movies with scientists blindly running after some rare orchid or snake really weren't kidding). Hey...I get it and I totally agree - data that belongs to the organization has to be treated with utmost confidentiality by the experts who collect it, especially since we all know it is not so easily retrieved. I even optimistically hope the same enthusiasm extends to their data cleaning and storage while they're at it. But that doesn't mean I have to like the repercussions. Especially not when someone expects a habitat suitability map from me, I have no data to work with, and all I have is a ping-pong game of jargon exchanged in the air, hoping the other player gets what I mean and coughs up something I can work with. Yes...there is not a shred of shame here when I talk about how things work in the world, but it is what it is and I'm not mad. It's just how it works in the challenging world of academics and research.
To cater for my lack of knowledge in biological data sampling and analysis, I signed up for the 'Wildlife Study Design and Data Analysis' workshop organized by the Biodiversity Conservation Society Sarawak (BCSS for short), or Pertubuhan Biodiversiti Konservasi Sarawak.
It just ended yesterday and I can't say I did not cry internally - from pain, gratitude and a sense of accomplishment. 10 days of driving back and forth between the city center and UNIMAS was worth the traffic shenanigans.
It is one of those workshops where you really do get down to the nitty-gritty of understanding probability distributions from scratch: how to use them in your wildlife study sampling design, and how to analyze the data to obtain species abundance, occupancy or survival. And most importantly, what Bayes has got to do with it. I've been seeing Bayesian stats, methods and networks in almost anything that involves data science, R and spatial stats, and I was quite miffed that I did not understand a thing. I am happy to report that now, I do. Suffice to say that it was a bootcamp well-deserving of its 'limited seats' reputation, and the certificate really does feel like receiving a degree. It boils down to me realizing a few things I didn't know:
I did not know that we have been comparing probabilities instead of generating a 'combined' one based on a previous study all these years.
I did not know that Ronald Fisher had such a strong influence that he could effectively ban the usage of Bayesian inference by deeming it unscientific.
I did not know that, for Fisher, if an observation cannot be repeated many times and is uncertain, then its probability cannot be determined - which is crazy! You can't inject people with a virus over and over and watch who dies just to generate the probability that it is deadly!
I did not know that Bayes' theorem actually combines the prior probability with the likelihood of the data you collected in the field to generate the posterior probability distribution (see the formula after this list)!
I did not know that Thomas Bayes was a pastor, and that his theorem was so opposed during his time. It was only after Ronald Fisher died that Bayesian inference gained favor, especially in the medical field.
I did not know...well...almost anything at all about statistics!
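For the record (and because I will forget otherwise), here is the theorem in its standard form, with theta as the parameter of interest and y as the data collected in the field:

```latex
% Bayes' theorem: the posterior combines the prior with the likelihood
P(\theta \mid y) = \frac{P(y \mid \theta)\, P(\theta)}{P(y)}
% and since P(y) is just a normalizing constant:
%   posterior \propto likelihood \times prior
```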
It basically changed the way I look at statistics. I taught myself statistics for close to 9 years and of course I got it wrong most of the time; now I realize that for the umpteenth time, and for that, I hope the statistics powers that be forgive me. This boot camp was so effective, I believe, because of the effort put into developing and executing activities that demonstrate each probability distribution model we were observing. In fact, I wrote the activities down next to each topic just to remember what the deal was. Some of the topics covered were the basics of the Binomial distribution, Poisson distribution, Normal/Gaussian distribution, posterior probability, Maximum Likelihood Estimation (MLE), AIC, BACI, SECR, and occupancy and survival probability. Yes...exhausting, and I have to say, it wasn't easy. I could get distracted by a falling piece of paper for a fraction of a second, just to find myself lost in the barrage of information. What saved me were the quizzes we had to fill in to evaluate our understanding of each day's topic, which we discussed first thing in the next session. Best of all, we were using R with the following packages: wiqid, unmarked, rjags and raster. The best locations for camera trap installation were discussed as well, and all possible circumstances of your data - its management, and the collection itself in the field - were covered rigorously.
For any of you out there doing wildlife studies, I believe this boot camp contains quintessential information to help you design your study better. Because once the data is produced, all we can do is dance around finding justifications for common pitfalls that could have been countered quite easily.
In conclusion, not only did this workshop cast data analysis in a new light for me, it also helped establish the correct steps and enunciate the requirements to get the most out of your data. In my case, it has not only let me understand what could be going on with my pals who go out into the jungle to observe wildlife first hand, it has also given me ideas for finding resources that implement Bayesian statistics/methods in remote sensing and GIS in general. Even though location analysis was not discussed beyond placing observation locations and occasions on the map, I am optimistic about extending what I understood to some of the things I'm planning: habitat suitability modeling, and how to not start image classification from scratch...every single time, if that's even possible.
For more information on more workshops by BCSS or wildlife study design and the tools involved, check out the links below:
Biodiversity Conservation Society Sarawak (BCSS) homepage: https://bcss.org.my/index.htm
BCSS statistical tutorials: https://bcss.org.my/tut/
Mike Meredith's home page: http://mikemeredith.net/
And do check out some of these cool websites that I have referred to for more information as well as practice. Just to keep those brain muscles in the loop with these 'new' concepts:
Statistical Rethinking: A Bayesian Course with Examples in R and Stan: https://github.com/rmcelreath/statrethinking_winter2019
Probability Concepts Explained: Introduction by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-introduction-a7c0316de465
Probability Concepts Explained: Maximum Likelihood Estimation by Jonny Brooks-Bartlett: https://towardsdatascience.com/probability-concepts-explained-maximum-likelihood-estimation-c7b4342fdbb1
Probability Concepts Explained: Bayesian Inference for Parameter Estimation by Jonny Brooks-Bartlett
I'll be posting some of the things I am working on while utilizing the Bayesian stats. I'd love to see yours too!
P/S: Some people prefer to use base R with its simple interface, but if you're the type who works better with everything within your focal-view, I suggest you install RStudio. It's an IDE for R that helps to ease the 'anxiety' of using base R.
P/S/S: Oh! Oh! This is the most important part of all. If you're using ArcGIS Pro like I do, did you know that there's an R-Bridge that enables access to your R workspace from within ArcGIS Pro? Supercool, right?! If you want to know more, check out this short 2-hour course on how to get the extension in, with an example of how to use it:
Using the R-Bridge: https://www.esri.com/training/catalog/58b5e417b89b7e000d8bfe45/using-the-r-arcgis-bridge/
With this, I am commencing my submission for the #30DayMapChallenge for 2023 🗺
The categories outlined are similar to last year's, but I am never going to hate this repetition. How can I? It's the basics of making maps, and there's so much to learn from a single-word theme.
Any aspiring map-makers out there? Let's share our maps for this wonderful month of November under the #30DayMapChallenge 2023!
Tool: ArcGIS Pro 2.6.3 Technique: Symbolization, labeling and SQL expression
MBR 2023 is the peak event that culminates all the effort of data collection and stock-taking of hydrocarbon resources in Malaysia. It is an annual event that puts together all the exploration blocks, discoverable hydrocarbon fields and late-life assets for the upstream sector to evaluate and invest in.
Leading up to the event, Malaysia Petroleum Management (MPM) updates, re-evaluates and produces maps - static and digital - to cater to the need for the most up-to-date stock-take of information gained from various exploration outputs (seismic, full tensor gradiometry), assets (cables, pipelines, platforms), as well as discoverable resources. This year's theme aims to include various prospects and initiatives to align the industry with lower carbon emissions and to explore options for carbon capture and storage (CCS) in popular basins such as the Malay and Penyu basins. This is a big follow-up to the closing of MBR 2022, with the PSC signing for 9 blocks a few days earlier.
Credit: Sh Shahira Wafa Syed Khairulmunir Wafa
Over 70 maps for unique blocks were produced during the finalization stage, around 210 during data evaluation, and an additional 20 for the event itself. And this excludes the standardized maps that formalize information requested by prospective bidders, as well as by clients facing the prospect of extending their contracts.
Standardizing the maps required optimizing the workflow and building standard templates to cater to rapid changes and rapid export of output.
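For a sense of how that kind of rapid, standardized export can be scripted, here is a minimal arcpy sketch that loops over the layouts in an ArcGIS Pro project and exports each to PDF. The project path and output folder are hypothetical placeholders; this illustrates the approach, not the exact workflow used for MBR.

```python
# Minimal sketch: batch-export every layout in an ArcGIS Pro project to PDF.
# Paths below are hypothetical placeholders.
import os
import arcpy

project = arcpy.mp.ArcGISProject(r"C:\work\mbr_maps.aprx")
out_dir = r"C:\work\exports"

for layout in project.listLayouts():
    # Layout names double as file names for the exported maps
    pdf_path = os.path.join(out_dir, f"{layout.name}.pdf")
    layout.exportToPDF(pdf_path, resolution=300)
```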
For more information on the event, please access the following resources:
PETRONAS: Malaysia Bid Round
PETRONAS myPROdata
The Malaysian Reserve: Petronas offers 10 exploration blocks in MBR 2023
Tool: ArcGIS Pro, ArcGIS Pro Deep Learning extension, Python, Jupyter Notebook Technique: Deep learning; semantic segmentation, cartography, remote sensing
This was the presentation of an abstract outlining the implementation of deep learning for land cover classification across the island of Borneo. It uses Sentinel-2 image data and a band combination that differentiates bareland, tree cover, waterbodies and croplands, whilst training a U-Net model on the reference data collected.
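For anyone curious what that training setup looks like in practice, here is a minimal sketch using the arcgis.learn module, which ships a U-Net implementation for exported training chips. The paths, batch size and epoch count are assumptions for illustration; the actual study's settings may differ.

```python
# Minimal sketch: train a U-Net land cover classifier with arcgis.learn.
# Assumes training chips were exported beforehand (e.g. with the
# "Export Training Data For Deep Learning" tool); paths are hypothetical.
from arcgis.learn import UnetClassifier, prepare_data

data = prepare_data(r"C:\work\training_chips", batch_size=8)

model = UnetClassifier(data)
model.fit(10)  # 10 epochs as a starting point

model.save("landcover_unet")
```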
Please find the abstract published here:
Warta Geologi, Vol. 47, No. 1, April 2021
The presentation slide can be accessed at the following link 👇🏻:
Hunting for spatial data comes naturally now. There seems to be less and less opportunity for doubt when we can attach a pair of coordinates to a place.
For work and hobby, hunting for data takes almost half of the usable hours I set aside to execute certain objectives; if not 100%. Although the internet is a vast plain of data, not all of it is usable. The democratization of data is a subject too translucent to discuss but too solid to argue with. Thus, with differing opinions, we get different versions of it online. Here are some of the interesting data platforms I managed to scour, organized by thematic subject:
🌳 Nature and Environment
Delta at Risk - Profiling Risk and Sustainability of Coastal Deltas of the World. I found this while lamenting how people love asking for data additions to their maps at the eleventh hour. I find their confidence in my skills quite misleading but flattering nonetheless. But it does not make it any less troublesome.
Protected Planet - Discover the world's protected and conserved areas. This platform includes not just data on protected areas, but also other effective area-based conservation measures like ICCAs and IUCN listings, and as the website claims, it is updated regularly via submissions from agencies. So far, I have found this platform to be the most convenient, since it rounds up all possible conservation-based themes, including World Heritage Sites.
Global Forest Change (2000-2020) - The global forest extent change from 2000 to the current year, lovingly referred to as the Hansen data by most forestry RS specialists. This data is updated annually and, to be honest, platforms hosting it are literally everywhere. But this one is the legitimate version under Earth Engine Apps, and you can refer to Google Earth Engine for future data updates to ease your search.
👩⚖️ Administrative Data
GADM - Map and spatial data for all countries and their sub-divisions.
🏦 Built-environment Data
OpenStreetMap - This database is the most amazing feat of tech-aware crowdsourcing. A little more than 2 decades ago, some 'experienced' gate-keeping professionals would have refuted its legitimacy within an inch of their lives, but OSM has proven that time prevails when it comes to bringing accessibility and network data into practical use. I am not that adept with downloading from the main website, so I go directly to a more manual data download. My favorite is the Geofabrik Download, but you can also try Planet OSM.
🎮 Other Cool Data
OpenCell ID - Open database platform of global cell towers. Cleaning the data is a nightmare but I think it is just me. I have little patience for cerebral stuff.
So, those are some of the data sources I managed to dig up for personal projects. Hope they help you guys too!
Here's a quick rundown of what you're supposed to do to prepare yourself to use Python for data analysis:
Install Python ☑
Install Miniconda ☑
Install the basic Python libraries ☑
Create new environment for your workspace
Install geospatial Python libraries
Let's cut to the chase. It's December 14th, 2021. Python 3 is currently at version 3.10.1. It's a great milestone for Python 3, but there was hearsay of issues concerning 3.10 when it comes to using it with conda. Since we're using conda for our Python library and environment management, we stay safe by installing Python 3.9.5.
Download 👉🏻 Python 3.10.1 if you want to give a hand at some adventurous troubleshooting
Or download 👉🏻 Python 3.9.5 for something quite fuss-free
📌 During installation, don’t forget to ✔ the option Add Python 3.x to PATH. This enables you to access your Python from the command prompt.
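A quick way to confirm the PATH option took effect: open a fresh command prompt and ask Python for its version. Any version number echoed back means you're set.

```
python --version
```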
As a beginner, you'll be informed that Anaconda is the easiest Python library manager GUI implementing conda, and that it contains all the core and scientific libraries you'll ever need for data analysis upon installation. So far, I believe it's unnecessarily heavy, the GUI isn't too friendly, and I don't use most of the pre-installed libraries. So, after a few years in the dark about it, I resorted to jumping ship and using the skimmed-down version of conda: Miniconda.
Yes, it does come with the warning that you should have some sort of experience with Python to know what core libraries you need. And that’s the beauty of it. We’ll get to installing those libraries in the next section.
◾ If you’re skeptical about installing libraries from scratch, you can download 👉🏻 Anaconda Individual Edition directly and install it without issues; it takes some time to download due to the big file and a tad bit longer to install.
◾ Download 👉🏻 Miniconda if you’re up to the challenge.
📌 After you've installed Miniconda, you will find it under the Anaconda folder in your Windows Start menu. By this time, you will already have Python 3 and Anaconda ready on your computer. Next, we'll jump into installing the basic Python libraries necessary for core data analysis and create an environment to house the geospatial libraries.
Core libraries for data analysis in Python are the following:
🔺 numpy: a Python library that enables scientific computing by handling multidimensional array objects (including masked arrays and matrices) and all the mathematical operations involved.
🔺 pandas: enables the handling of 'relational' or 'labeled' data structures in a flexible and intuitive manner. Basically, it lets you work with data in a tabular structure similar to what we see in Excel.
🔺 matplotlib: a robust library that helps with the visualization of data - static, animated or interactive. It's a fun library to explore.
🔺 seaborn: another visualization library, built on top of matplotlib, that is more high-level and produces more crowd-appealing visualizations. Subject to preference, though.
🔺 jupyter lab: a web-based user interface for Project Jupyter where you can work with documents, text editors, terminals and/or Jupyter Notebooks. We are installing this library to tap into the notebook package that comes with it.
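As a small taste of how these libraries click together once installed, here's a tiny, self-contained sketch - purely synthetic data - that touches numpy, pandas and matplotlib in one go:

```python
# Tiny end-to-end taste: numpy generates data, pandas tabulates it,
# matplotlib plots it. The rainfall figures are purely synthetic.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "rainfall_mm": rng.gamma(shape=2.0, scale=50.0, size=365),
})

print(df.describe())  # quick summary statistics

df["rainfall_mm"].plot(kind="hist", bins=30, title="Daily rainfall (synthetic)")
plt.show()
```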
To start installing:
1️⃣ At Start, access the Anaconda folder > Select Anaconda Prompt (miniconda3)
2️⃣ An Anaconda Prompt window similar to the Windows command prompt will open > Navigate to the folder where you would like to keep your analytics workspace using the following common command prompt commands:
◽ To backtrack a folder level 👇🏻
◽ Change the current drive to the X drive 👇🏻
◽ Navigate to certain folders of interest, e.g. deeper from the Lea folder, i.e. Lea\folder_x\folder_y 👇🏻
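For completeness, here are those three commands in order - standard Windows command prompt syntax, with 'x' and the folder names as placeholders:

```
:: backtrack one folder level
cd ..

:: change the current drive to drive X
x:

:: navigate deeper from the Lea folder into Lea\folder_x\folder_y
cd Lea\folder_x\folder_y
```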
3️⃣ Once you've navigated to the folder of your choice, you can start installing all of the libraries in a single command as follows:
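Assuming the five libraries listed above and conda's default channel, that single command would look like this (note that 'jupyter lab' installs under the package name jupyterlab):

```
conda install numpy pandas matplotlib seaborn jupyterlab
```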
The command above enables the simultaneous installation of all the essential Python libraries needed by any data scientist.
💀 Should there be any issues during the installation, such as an uncharacteristically long installation time (1 hour is stretching it), press Ctrl + C to cancel any pending processes and retry by installing the libraries one by one, i.e.:
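One library at a time, that retry looks like this:

```
conda install numpy
conda install pandas
conda install matplotlib
conda install seaborn
conda install jupyterlab
```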
Once you manage to get through the installation of the basic Python libraries above, you are halfway there! With these packages, you are already set to do some pretty serious data analysis. The numpy, pandas and matplotlib libraries are the triple threat of exploratory data analysis (EDA), and the jupyter lab library provides a documentation-plus-code notebook that is shareable and editable among teammates or colleagues.
Since we're the folks who like to make ourselves miserable with the spatial details of our data, we will climb another 2 hurdles: creating a geospatial workspace using conda, and installing the libraries needed for geospatial EDA.
If you're having issues following the steps here, check out the real-time demonstration of the installations at this link 👇🏻
See you guys in part 2 soon!
I am a reckless, uninspired person. I call myself a map-maker, but I don't really get to make maps because I don't think I should venture outside of my requesters' requests. Mostly, I am compelled to get it right, and I feel good if I can deliver what they need. The thing is, I no longer get spontaneously inspired to make maps. Just as the rules become clearer the more you read books on cartography, fears crop themselves up like 'Plants vs Zombies' 🌱 on PlayStation.
So, I am scared that I'm beginning to wear out my excitement about making maps; really making them, not just knowing how to make them.
What sort of idea is great? I mean, what should I focus on trying to make? There is so much data out there that whatever I attempt may be missing the train or just pale in comparison to other incredible work. I don't really mind it, but I'm not so young that I don't understand that self-esteem does ease the thinking process.
Can't say much, I mean...the 30 Day Map Challenge hasn't been going all that well for me. I should've prepared something before the event started. I quit after the 3rd challenge because I overthink and get panic attacks every time I feel I'm doing stuff half-assed.
Despite all that, I am lucky to have aggressively supportive siblings. They just can't seem to stop the tough love and are always kicking me to just barf something out.
'It's the process that matters!'
When did I start forgetting how wonderful the process is, huh?
Coding is one of the things I have aspired to do since, like...forever! But finding a resource in sync with my comprehension and schedule, and able to retain my interest long enough, is a challenge.
I have the attention span of a gnat, so I jumped everywhere! If I am not actively engaged with the learning, I just can't do it. And I know...we have DataCamp, Udemy, Khan Academy and even Kaggle...but I either can't keep up, am too poor to pay for the full course, or it just couldn't sync with me. I believe I can say that most of the exercises don't 'vibe' with me.
Recently, I committed myself to one passion: running. It was one of my favorite activities back in school, but the will to really run died a decade ago. I have recently picked up my running shoes and run my little heart out despite having the speed of a running ant; aging, perhaps? And I owe my hardcore will to the motivation of earning back what I paid when I decided to join a month-long, 65 km virtual run: the 'Pave Your Path' virtual run organized by Running Station. Nailed it 2 days ago after 13 sessions of 5 km - yes, you can accumulate the distance over multiple runs. It made me realize that...it's not that bad. The 'near-death' experience while running has kinda turned me into a daredevil these days when it comes to undertaking things I'd have whined about doing a few months back.
"If I can go through dying every single evening for 5km long run...I can handle this,"
My thoughts exactly, every time I feel reluctant to finish some task I believe I could hold off for a while.
Naturally, I plan my work rigorously, and despite the flexibility of my schedule and my detailed plans, I still have a hard time hammering the last nail into my projects. Usually, it's due to my brain's exhaustion from overthinking, or I am just truly tired physically - which is a weird situation given I do not farm for a living. Even so, I was lethargic all the time.
But when I started running a month ago, things kind of fell into place for me. Maybe...just maybe...I've become more alert than I used to be. I still have my ignorance of things that I believe do not concern my immediate attention, but I seem to be able to network my thoughts faster than I used to.
It might be just me, feeling like a new person due to my sheer willpower to not burn my RM60 paid for the virtual run, but it did feel like there was a change.
For that, I managed to confirm what I have suspected all along - I am one of those people who love drills. I like things to be drilled into my head until I know them by heart and can do them efficiently, and then I focus on polishing the effectiveness.
Thus...for coding, I committed myself to freeCodeCamp. By hook or by crook, I'll be coding by the first quarter of next year or someone's head is gonna roll!
It's an interactive learning experience simple enough for me to start, straightforward enough to not make me waste my time searching for answers and it's free. God bless Quincy Larson.
Going back to the program outlined in freeCodeCamp, I find it fascinating that they start off with HTML. I have no arguments there. My impatience made me learn my lesson - run too fast, and you're going to burn out painfully and drop dead before you're halfway through. HTML is a very gentle introduction to coding for newbies since it's like LEGO building blocks, where you arrange blocks and match two to create something. I didn't have to go crazy with frustration if I didn't 'get' it. Yes, we would all want some Python lovin', and I think a lot of coders I came to know have raved about how simple it is to learn. But I think that is an opinion shared by 'experienced' coders who wish Python had been there when they first started coding. Someone once told me that what you think is the best based on others' experiences may not be the best for you...and I agree with this. After a lot of deliberation and patience on my end, starting over this time feels unlike the dreaded looming doom I've always felt back then.
Are you into coding? What do you code and what's your language preference? Where did you learn coding? Feel free to share with me!
Esri has been releasing more and more MOOCs over the span of 2 years to accommodate its increasingly large expanse of products within the ArcGIS ecosystem.
But of all the MOOCs that I've participated in, 'Do-It-Yourself Geo App MOOC' must be the most underrated one produced by Esri Training. The functionalities highlighted within the MOOC took the anthem right off their recent Esri UC 2020, which went virtual. The curriculum includes:
The creation of a hosted feature layer (without utilizing any GIS software medium like ArcMap or ArcGIS Pro).
The basics of the ArcGIS Online platform ecosystem:
hosted feature layer > web map > web app
Basically, to view a hosted feature layer, you will need to drag it onto a 'Map' and save it as a web map.
Conventionally, a web map suffices for the visualization and analytical work of any geospatialist familiar with Web GIS.
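If you'd rather script that hierarchy than click through it, the ArcGIS API for Python mirrors the same flow. Here's a minimal sketch - the item ID, titles and username are hypothetical placeholders:

```python
# Minimal sketch: hosted feature layer -> web map, via the ArcGIS API for Python.
# The item ID, titles and username below are hypothetical placeholders.
from arcgis.gis import GIS
from arcgis.mapping import WebMap

gis = GIS("https://www.arcgis.com", "your_username")  # prompts for a password

# Fetch an existing hosted feature layer item by its ID
layer_item = gis.content.get("0123456789abcdef0123456789abcdef")

# Drop it onto a fresh web map and save - mirroring the drag-onto-'Map' step
webmap = WebMap()
webmap.add_layer(layer_item)
webmap.save({
    "title": "My first scripted web map",
    "snippet": "Hosted feature layer wrapped in a web map",
    "tags": "demo",
})
```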
But this time, Esri is highlighting a brand new web map product called 'Map Viewer Beta'. Why beta? Because it is still in beta, but so sleek and cool that they just had to let everyone have a shot at using it. Truth be told, Map Viewer Beta did not disappoint.
Even so, Map Viewer Beta still has some functionalities that have yet to be implemented.
Using a web map to visualize data, configure pop-ups, execute simple analysis, and extend it to the Map Viewer Beta interface.
Utilizing Survey123 for crowdsourcing data - the first level of citizen science - and creating a web map out of it.
Creating native apps using AppStudio for ArcGIS; no coding required.
Some tidbits on accessing the ArcGIS API for JavaScript.
I love how cool it is that this MOOC actually shows you, step by step, how to use the new Map Viewer Beta and explains the hierarchy of formats for published content in the ArcGIS Online platform.
I established my understanding of the ArcGIS Online ecosystem 3 years back, but I do find it awkward that such powerful information is not actually summarized in a way that is comprehensible to users who have every intention of delving into Web GIS. And Web GIS is the future, with all the parallel servers that can handle the processing/analysis of large amounts of data. ArcGIS Online is a simplified platform that provides interfaces for fresh-eyed new geospatial professionals.
It is quite well-known that there has been some criticism of Esri's domination of GIS tools/resources within the geospatial science industry, but I believe it is something we can take with a pinch of salt. Not everything in Esri's massive line of commercial products is superior to other platforms, but it is a starting point for any new geospatialist who wants to explore technologies they are not familiar with.
All in all, this MOOC is heaven-sent. For me, I have been playing with web apps and web maps for close to 4 years and I can attest to the fact that it covers all the basics. For the developer's bit, maybe not so much as a distinct step-by-step, but it does stoke curiosity as to how it works. The question is, how do we make it work? Now that's a mystery I am eager to solve.
I'm going to put this on my ever-expanding to-do list and think JavaScript for another few months while testing out this ArcGIS API for JavaScript implementation. Tell me if you wanna know how this actually works and I'll share what I find out when I do.
For those who missed out on this cohort, fear not. This MOOC runs twice a year, and the next cohort runs from Feb 17 to March 17, 2021. Registration is already open, so don't hold back and click the link below:
Do-It-Yourself Geo Apps
Do register for a public account before signing up, or just click 'Register' on the MOOC's page and it will open the option to either sign in or 'Create a public account'. It was a blast and I'm sure, if you've never used any of the features I've mentioned above, you'll be as wide-eyed as I was 3 years ago. :D
Till then, stay spatially mappy comrades!
P/S: If you complete all the assignments and quizzes, you'll get a certificate of completion from Esri. Which is pretty rad!
There are moments where base maps just couldn't or wouldn't cut it. And DEMs are not helping. The beautiful hillshade raster generated from the hillshade tool can't help it if the DEM isn't as crisp as you would want it to be. And to think that I've been hiding in hermitage to learn how to 'soften' and cook visual 'occlusion' to make maps look seamlessly smooth. Cartographers are the MUAs of the satellite image community.
I have always loved monochromatic maps where the visual is clean, the colors are not harsh, and everything is easy for me to read. There hasn't been much of a gig lately at work where map-making is concerned; the last one was back in April, for some of our new strategy plans. So, when my pal wanted me to just 'edit' some maps she wanted to use, I couldn't stop myself at just changing the base map.
The result isn't as much as I'd like it to be, but then, we are catering to the population that actually uses this map. It was inspired by the beautiful map produced by John M Nelson that he graciously presented at NACIS 2019: An Absurdly Tall Hiking Map of the Appalachian Trail. What I found absurd is how few views this presentation has. The simplicity of the map is personally spot-on for me. Similar to Daniel P. Huffman, as he confessed in his NACIS 2018 talk, Mapping in Monochrome, I am in favor of monochromatic color schemes. I absolutely loathe a chaotic map that looks like my niece's unicorn just barfed 70s deco colors all across the screen. Maybe for the practical purpose of differentiating the values of an attribute it is deemed justifiable, but surely...we can do better than clashing orange, purple and green together, no?
So...a request to change some labels turned into a full-on makeover. There are some things that I realized while making this map using ArcGIS Pro that I believe any ArcGIS Pro noob should know:
Sizing your symbols in Symbology should ideally be done in the Layout view. Trust me, it'll save you a lot of time.
When making outlines of anything at all, consider using a tone or two lighter than your darkest color, and make the line thinner than 1 pt.
Halos do matter for your labels and any other textual elements of your map.
Sometimes, making borders for your map is a wild goose chase. You don't particularly need them, especially if the map is going to be compacted together with articles or be part of a book, etc.
Using blue all the way might have been what I preferred, but they have different zonations for the rivers, so that plan went out the window.
And speaking of windows...the window for improvement in this map is as big as the US and Europe combined.