Compare commits


722 commits
0.0.14 ... main

Author SHA1 Message Date
970cd568dc
Add generated jet tables and models
These will change with schema changes
2026-05-09 01:54:12 +00:00
935800e334
Switch to using the Gleipnir fork of jet
Because we need those sweet, sweet geometry columns
2026-05-09 01:48:56 +00:00
b5bc54b7f4
Start adding context resources to communications
These will contain URIs for anything related to the communication
2026-05-09 01:39:08 +00:00
0301545df9
Go back to replacing jet for now
I need it for my geometry column types
2026-05-09 01:38:36 +00:00
e36b512908
Stop ignoring any paths with 'nidus-sync' in them
Dumb oversight on my part
2026-05-09 01:38:07 +00:00
aa3c6e6209
Don't use custom jet, tidy module 2026-05-09 01:22:30 +00:00
93b69c4cbb
Remove dead query.go at project root
The QueryWriter interface and queryToString function had zero callers.
The commented-out insertQueryToString was a Bob remnant. The io import
was only used in this file.
2026-05-09 01:05:29 +00:00
7ffa2e891b
Remove dead esbuild build.js and flake.nix dependency
build.js was an esbuild-based build script from the pre-Vite era (March 2026).
It is not referenced by package.json, CI, or any config. Vite is used for
both sync and rmo builds. Also dropped pkgs.esbuild from flake.nix devShell.
2026-05-09 01:03:51 +00:00
be99baf64c
Remove unused tomtom/ integration
TomTom was added Feb 2026 for routing but was never imported outside
its own directory. Stadia Maps is now the geocoding and tile provider.
No references in go.mod, go.sum, or any Go file.
2026-05-09 01:01:49 +00:00
8592659432
Test of agent capabilities:
"I'd like you to take a look at the project nidus-sync. It has been
developed over the past 6 months and had several different architectural
approaches that have evolved over time. The git history may be
especially useful to see the evolution. Write up a report of what you
understand has happened, with approximate timelines. This should go in
HISTORY.md. Determine which changes are incomplete - libraries or
approaches that are no longer preferred, but not fully removed yet.
Create another file, CLEANUP.md, which lists out cleanup efforts that
should be done to completely remove these older, less-preferred parts
of the code."
2026-05-09 00:54:39 +00:00
1b6fac3313
Distinguish between communication stubs and full resources
This is useful so I don't have to pull together the entirety of the log
for a communication list, which would be much more expensive.
2026-05-09 00:03:11 +00:00
01f35b603e
Add centralized error handler for sync Vue app 2026-05-08 23:33:49 +00:00
da90401b2d
Push location config to client
We'll let the default stay the default.
2026-05-08 22:48:51 +00:00
28cf7683a7
Pass-through an address shim with whatever data we have 2026-05-08 22:45:26 +00:00
d1ba2f53fa
Fix setting address on compliance reports
This error was subtle. First, we want to set the GID and raw content
directly using the updater instead of doing two round trips because we
can. Second, we want to do some geocoding if the address isn't already
in the system. Likely it is, because the frontend would have requested a
geocode, but it's possible that it isn't.
2026-05-08 22:43:57 +00:00
24a3610c4c
Correctly build updaters with New
Otherwise we have nil columns
2026-05-08 22:22:52 +00:00
7da653efc6
Avoid a DB query if there are no address IDs 2026-05-08 22:21:56 +00:00
735a9dc1d2
Properly close rows on empty results
If we don't do this we get "conn busy" errors.
2026-05-08 22:21:27 +00:00
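The "conn busy" fix above follows a standard Go database pattern: always close the rows object, even when the result set is empty, or the connection stays checked out. A minimal sketch with illustrative names (fakeRows and collect stand in for the project's actual pgx types, which aren't shown here):

```go
package main

import "fmt"

// fakeRows stands in for a pgx/database-sql rows handle so the pattern
// can be shown without a live database.
type fakeRows struct {
	data   []string
	pos    int
	closed bool
}

func (r *fakeRows) Next() bool { r.pos++; return r.pos <= len(r.data) }
func (r *fakeRows) Close()     { r.closed = true }

// collect drains rows and always closes them. Skipping Close on the
// zero-row path is exactly what leaves the connection marked busy.
func collect(r *fakeRows) []string {
	defer r.Close() // runs on every return path, including empty results
	var out []string
	for r.Next() {
		out = append(out, r.data[r.pos-1])
	}
	return out
}

func main() {
	empty := &fakeRows{}
	collect(empty)
	fmt.Println("closed on empty:", empty.closed) // closed on empty: true
}
```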
f2585c569c
Whoops, actually set all columns on compliance because it doesn't have a serial key 2026-05-08 01:08:06 +00:00
0fc46d5916
Only set mutable columns on insert
Because we don't want to set ID and other primary keys
2026-05-08 00:56:55 +00:00
61ad3fbe45
Remove string-based queries for public report data
Use the new jet hotness
2026-05-07 23:22:50 +00:00
12213fb31b
Remove string-only references to location_* generated columns 2026-05-07 17:01:54 +00:00
7a361a330d
Remove now-extraneous latitude/longitude generated columns
Now that we can pull the geometry directly into a Go object we don't
need these, and they complicate our insertions
2026-05-07 16:38:42 +00:00
34a136eba5
Move user to compliance complete page for submitted reports 2026-05-07 16:17:00 +00:00
fcd95f1a25
Get back to compiling, but using new jet for publicreport
This was an epically long change, and a terrible idea, but it compiles.
This was essentially a cascade that came about because I can't blend jet
and bob in the same transaction. In for a penny, I guess...
2026-05-07 10:39:17 +00:00
a95e44cf42
Use transactions to set the communication status changes
Not doing it yet, but soon we'll do log entries for them.
2026-05-04 20:57:50 +00:00
040ab106b4
Fix failing to set timestamp in mark query
I accidentally didn't understand how this API works.
2026-05-04 20:30:36 +00:00
5f3fcc2b3e
Fix a bunch of not-checking-error lints 2026-05-04 20:29:02 +00:00
114aec73ed
Fix setting timestamp for when action is taken 2026-05-04 20:09:56 +00:00
60bf09e813
Avoid emitting error on transaction rollback that's complete
It's on purpose
2026-05-04 19:53:36 +00:00
347f62bd6d
Fix method on marking communications 2026-05-04 19:43:37 +00:00
878f0e9bcf
Try a different way to limit linting 2026-05-04 19:43:20 +00:00
18db17fe0b
Don't lint every go file on every commit.
Faster commits, less redundancy
2026-05-04 19:40:12 +00:00
b53c908b55
Fix warning from sentry setup 2026-05-04 19:39:17 +00:00
dc2fee3a9d
Fix selecting items in the communication list 2026-05-04 19:39:03 +00:00
387be40076
Return ID as a string from API
Because they are opaque, not something to math
2026-05-04 19:37:05 +00:00
3153e8bf13
Initial work marking communications
And a bunch of lint fixes
2026-05-04 19:07:29 +00:00
67c99436d1
Properly set submitted on PUT, return new status properties on comms 2026-05-02 00:41:31 +00:00
431435f8bd
Set the organization on inserted communications 2026-05-02 00:38:38 +00:00
57dc2023cd
Remove unused submit function 2026-05-02 00:38:12 +00:00
d6b664d84a
Return the communication we create
...or else it'll be empty.
2026-05-02 00:37:51 +00:00
52d4c47e43
Fix lint errors related to not checking errors 2026-05-02 00:37:28 +00:00
7f71ff9a2e
Send submit PUT on compliance report flow, create communication then
This makes it so that people don't see compliance reports as they're
being formulated in the communication workbench
2026-05-01 21:27:17 +00:00
a82732a49c
Return communication database rows from communication API
This is a pretty big refactor of how communication works to start moving
us in the direction we want to go long-term. This adds the new
communication row and migrates existing reports to add rows for
communication.

There's also a bunch of automatic fixes from the new linter. I should
have added them separately, but whatever.
2026-05-01 21:00:23 +00:00
a6ce0b7e67
Add golangci-lint to lefthook workflow 2026-05-01 17:36:32 +00:00
bab3200b6c
Port all of the arcgis schema to using jet
Have not tested anything at this point, it just compiles.
2026-05-01 17:28:33 +00:00
89ed2003fa
Ignore custom jet schema binary 2026-05-01 15:13:50 +00:00
a82b2b8cb8
Add new communication table
It allows us to track when communication tasks are complete, and
information about how they were completed, separate from the entries
that created the tasks in the first place (reports, emails, texts)
2026-05-01 15:13:05 +00:00
6a47302192
Create custom jet template for both stadia and arcgis 2026-05-01 15:11:20 +00:00
0e0b2489e6
Get beginnings of custom column type working 2026-05-01 06:27:26 +00:00
9ef4dad27c
Initial custom jet generator
I'll need it for Postgis data types
2026-05-01 05:45:45 +00:00
e5a84e09a8
Initial working version of using jet for SQL building 2026-05-01 05:11:28 +00:00
4bd62b3567
Fix publicreport store name pollution
This was causing a request to be made to the wrong API endpoint by going
to /api/publicreport instead of /api/rm/publicreport which doesn't work
on RMO's hostname.
2026-05-01 02:37:54 +00:00
503cde6063
Init sentry first, then mount the app
Gets rid of a warning from the Sentry SDK
2026-05-01 01:52:44 +00:00
8757f1cda3
Fix loading on status page
It was infinite looping in the computed value for report
2026-05-01 01:52:11 +00:00
ace2557a60
Fix navigation from RMO status page on report table 2026-05-01 01:43:57 +00:00
537d5c9133
Save address input if the user clicks the "back" button. 2026-04-30 16:46:28 +00:00
00d26a684a
Handle EXIF location data set to "NaN"
Probably Android's new privacy thing. Jerks.
2026-04-30 15:34:08 +00:00
1cfe51f894
Actually check the error state for saving an image 2026-04-30 14:27:45 +00:00
43edca7093
Add new response template 2026-04-30 13:58:31 +00:00
839ed138ca
Ignore vite deps 2026-04-30 03:48:53 +00:00
797067ee38
Refuse to send compliance letters to addresses without a postal code 2026-04-30 03:09:42 +00:00
e45e05f337
Copy vite build output to frontend in nix package
Then we can upload the symbols when we run
2026-04-30 03:09:06 +00:00
b8a9ecb253
Fix warning from vite about concern having multiple root entities 2026-04-30 00:07:37 +00:00
0c464a9963
Get working sentry for the UI
Previously it almost worked, but didn't quite. Now it actually works, but
the stack traces are minified.
2026-04-29 23:59:34 +00:00
2f6cbe59eb
Add root API to RMO api
For getting sentry integration information
2026-04-29 23:58:49 +00:00
33ecfce313
Use RMO publicreport store in compliance and district report creation 2026-04-29 22:36:33 +00:00
9229725300
Consistently show loading state on compliance flow button 2026-04-29 22:27:08 +00:00
89eca2ddf9
One more attempt to handle report null-ness 2026-04-29 20:58:47 +00:00
a1b2d580a8
Return nil through on by id compliance 2026-04-29 20:37:36 +00:00
c6cb645453
Don't error out on missing report 2026-04-29 20:24:41 +00:00
7e79308868
Don't return an error when report doesn't exist 2026-04-29 20:18:36 +00:00
da75aeecf2
Allow multiple posts of address report
Since this is the first step of scanning the mailer.
2026-04-29 19:30:43 +00:00
af39a73e8f
Add address raw content to report
This populates the address in the compliance flow UI
2026-04-29 19:30:38 +00:00
53ce100859
Fix copy-paste error on mailer store 2026-04-29 19:17:20 +00:00
364b4ddc32
Remove click timeout on MapLocator
This was added to try to fix scrolling past the map on phones. Instead,
it just confuses click-and-drag. We now rely on the lock/unlock overlay
on the map to make scrolling past it work.
2026-04-29 15:58:57 +00:00
06fda0554c
Fix link to report status 2026-04-29 15:55:27 +00:00
0375f666d6
Remove detailed identification guide link
It didn't go anywhere.
2026-04-29 15:54:31 +00:00
aefb5ec6bd
Fix tooltips on water report page 2026-04-29 15:37:38 +00:00
0a0e6f6301
Prevent nuisance and water form submission on enter 2026-04-29 15:31:04 +00:00
ab5dd13fcb
Make "Submit Another Report" go to root so they can choose the type 2026-04-29 15:21:54 +00:00
d67c54c6e7
Set authenticated after signin completes without error 2026-04-29 15:04:44 +00:00
4bb37c5ab3
Reconnect SSE event stream after shutdown
Otherwise we'll never know we have updates
2026-04-29 15:02:18 +00:00
2fbceb11e3
Don't bail on district match early, check address
This is the other half of doing proper district match via raw address -
we have to use the address if available for looking up a district.
2026-04-29 15:01:35 +00:00
524353bfa1
Geocode address if we only have a raw value
This will help with matching when the user does not select a suggested
address.
2026-04-29 13:55:10 +00:00
822dad5352
Don't require systemd sockets in dev mode
Because otherwise I can't run the program on my dev server
2026-04-29 13:54:48 +00:00
5ac778ea53
Make the shutdown event a status message
That way our status handlers on the frontend will know what additional
data is available
2026-04-29 13:54:17 +00:00
f3af19f03a
Add systemd activation sockets for downtime-free deploys 2026-04-28 23:24:19 +00:00
bd3d3881f5
Increase server shutdown timeout
We shouldn't see this unless we fail to get everything closed down.
2026-04-28 22:37:43 +00:00
a17544bb4b
Downgrade errors on server shutdown 2026-04-28 22:25:31 +00:00
a101ff3cc9
Suppress errors from canceled context on DB notification goroutine 2026-04-28 22:24:15 +00:00
a5b9ee0c6c
Actually check error code on DB query for audio post 2026-04-28 22:18:41 +00:00
4a90917645
Add more detail to address creation failure 2026-04-28 22:16:22 +00:00
d5d6201177
Improve error response from Lob integration 2026-04-28 22:14:05 +00:00
309f8fe2c5
Downgrade failure to get admin info to warning
To clear out Glitchtip a bit
2026-04-28 22:10:39 +00:00
bf6b5dcb17
Fix privacy policy render 2026-04-28 21:42:48 +00:00
4ed005fb37
Move map as bounds change on communication workspace 2026-04-28 20:37:38 +00:00
4fcb184286
Make communication map location reactive
Prevents stateful errors like failing to update bounds when the API
fetches a new value for a report
2026-04-28 20:31:40 +00:00
2911c7b215
Add a parameter to track which communication is selected 2026-04-28 20:09:13 +00:00
ff668c223b
Add update notification when version changes 2026-04-28 17:09:43 +00:00
52c41e29d8
Distinguish between status messages and resource messages in SSE 2026-04-28 17:06:21 +00:00
38359e20e9
Use auto build version info for embedding version information
This is better, integrates with git, gives us more detail, and I don't
have to explicitly pass it around everywhere.
2026-04-28 16:36:48 +00:00
20bf272746
Clean up communication list display 2026-04-28 15:30:15 +00:00
060d0dd95f
Add compliance card detail information display 2026-04-28 15:22:20 +00:00
72626e8dd0
Fix incorrectly passing public status to platform for nuisance 2026-04-28 14:56:06 +00:00
b68d93ec91
Load communication reports asynchronously
This solves some problems created by making the publicreport part of the
communication API consistent. There are a lot of optimizations still on
the table with this one, but for now I need to get this out.
2026-04-28 14:49:02 +00:00
6e3d079c46
Immediately mark session authenticated on successful signin 2026-04-28 14:37:33 +00:00
878b43c0a6
Fix text log logo for outgoing 2026-04-28 07:45:51 +00:00
175fd8d0fb
Fix water public ByIDGet uri generation 2026-04-28 07:34:54 +00:00
dba8b6c475
Consistently use the correct public URI for public reports 2026-04-28 07:12:12 +00:00
32a0d895c4
Fix missing RMO report store 2026-04-28 06:54:16 +00:00
4ae0410930
Fix various inter-linkings of public report paths 2026-04-28 06:53:58 +00:00
8bdd18649d
Separate out public and non-public halves of publicreport APIs
This prevents us from leaking text messaging details on public
endpoints.
2026-04-28 06:36:55 +00:00
8fcd926d43
Format report IDs with hyphens 2026-04-28 05:33:53 +00:00
68adab88bc
Re-add notifications registered page 2026-04-28 05:30:08 +00:00
82b313f62f
Add text message log to report display 2026-04-28 05:15:01 +00:00
4ce91d77d4
Add text message history to activity log 2026-04-28 01:12:18 +00:00
6350aa00d5
Populate report URI and district on communication list 2026-04-27 23:15:33 +00:00
909665ab6c
Use new map system for communication pane, fix location markers 2026-04-27 20:53:46 +00:00
1dba58472b
Add support to communication list view for compliance entries 2026-04-27 20:06:44 +00:00
9c392e5791
Make public report card name consistent with other components 2026-04-27 19:53:22 +00:00
63ebe382b6
Properly convert communication API objects
Gets rid of a warning when showing relative time.
2026-04-27 19:50:47 +00:00
a2b8527d91
Track user location with map and address data
This is useful because everywhere that we use the AddressAndMapLocator
component we also want to use the user's location and we want to zoom
the map based on their location. Instead of tracking this externally in
3 places we just pull it into the component.
2026-04-27 19:44:25 +00:00
3867737fcc
Track camera changes during map load
This is necessary so that we can frame the map at any time in client
code, like with the user's location data, and still end up with the
correct location.
2026-04-27 19:44:25 +00:00
937953f2a2
Produce a raw address value from geocode requests
This makes it so that the frontend doesn't have to calculate what to
display
2026-04-27 19:44:25 +00:00
96498c01bf
Add API for getting just the closest reverse geocoded answer
Because we don't care about anything that is nearby when the user clicks
on the map, we just want the closest thing.
2026-04-27 19:44:25 +00:00
b92697b8c8
Remove erroneous "MapCell" component
Eliminates a warning in the build step
2026-04-27 16:23:46 +00:00
ffe427564b
Add email and phone display to communications workbench 2026-04-27 16:23:31 +00:00
be8d92d7ae
Add 'submitted' field to compliance reports 2026-04-27 16:23:16 +00:00
8a05ba2faf
Initial reimplementation in VueJS of address or report suggestion 2026-04-25 00:17:35 +00:00
c783ab7942
Add report rendering table to status page 2026-04-24 23:06:07 +00:00
5e638bdf1d
Remove hidden water inputs, add missing duration input 2026-04-24 22:23:52 +00:00
203d2014b0
Show map with nuisance and water on status page
Leverages the new declarative map logic. Still missing a bunch of
features
2026-04-24 22:20:01 +00:00
3bfcfff1eb
Navigate to cell detail page on cell click 2026-04-24 13:48:00 +00:00
e5080eaaf6
Add click event for cells on the dash map 2026-04-24 13:23:03 +00:00
a88aa4c8a0
Change cursor when the user hovers over a layer 2026-04-24 00:36:18 +00:00
6992031007
Add callback for when the mouse enters or leaves a layer 2026-04-24 00:31:03 +00:00
c1a8249dc0
Don't remove source and layer on unregister
I'm not convinced this is necessary since we are freeing the map itself
and it's causing a crash on navigation away.
2026-04-23 23:47:16 +00:00
77283b3654
Add parcel data to overview heatmap 2026-04-23 23:46:31 +00:00
f1e4aca9b8
Set bounds to default if the district doesn't have a service area 2026-04-23 23:38:12 +00:00
963254973b
Move map-utils into map/
Namespacing matters
2026-04-23 23:31:50 +00:00
cad01e689e
Initial working genericized map implementation
This shows dynamically adding layers and sources and actually reads from
them!
2026-04-23 23:02:53 +00:00
c6282c9f5e
Default to setting AddressGid
So we don't run afoul of the nullable constraint
2026-04-23 22:41:22 +00:00
c8989237b0
Fix reference to address number. Again. 2026-04-23 21:39:13 +00:00
516dd6f429
Don't show "send compliance mailer" if org isn't configured for Lob 2026-04-23 15:50:58 +00:00
72a8ed5c16
Improve signin messaging 2026-04-23 15:24:06 +00:00
b4e6bac566
Fix reference to number_ column 2026-04-23 13:53:22 +00:00
cc59ccb9b5
Fix reference to address number column 2026-04-23 00:34:51 +00:00
1ddba5ebb1
Fix qr code generator functions 2026-04-23 00:30:42 +00:00
582aa952e4
Remove template test 2026-04-23 00:30:35 +00:00
10dc5c0bd7
Move qr-code generation to the API 2026-04-23 00:28:31 +00:00
7be8b428e4
Remove remaining sync mocks 2026-04-22 23:02:21 +00:00
1d266c88c1
Fix initial view of markers on map load
The issue here was that "fitBounds" doesn't work before the map is
loaded, we have to use the map constructor to set the location.
Therefore it makes no sense to even attempt these operations internally
before loading.
2026-04-22 22:43:16 +00:00
5caa9d8c7a
Populate address if we have enough data on compliance address form 2026-04-22 22:42:26 +00:00
b0170b20d5
Update fetching address number to match new types.Address pattern
This matches what we get by using the models column definition directly.
2026-04-22 22:20:42 +00:00
f24a583e2e
Add beginning of cell detail page 2026-04-22 22:19:53 +00:00
23819961e6
Populate compliance address based on site location 2026-04-22 21:22:33 +00:00
a8819c907e
Add concern page to mailer compliance flow 2026-04-22 21:22:03 +00:00
b5923137a7
Set organization (district) for compliance reports from mailer 2026-04-22 19:54:06 +00:00
78458760ec
Navigate to cell on aggregate map click 2026-04-22 15:46:02 +00:00
1286d0ea2a
Fix organization ID for aggregate map 2026-04-22 15:40:42 +00:00
2fbcb9f918
Add next value to signin page and actually use it 2026-04-22 15:30:24 +00:00
5cdbc4eb53
Fix links in the compliance process 2026-04-22 14:49:04 +00:00
b4527fba8b
Develop patterns for creating links outside router 2026-04-22 14:33:56 +00:00
bcd51cf5cf
Fix compliance query again.
Blarg.
2026-04-22 00:22:51 +00:00
8a9a3e8c0c
Update pnpm deps hash from sentry update 2026-04-22 00:05:27 +00:00
986d12eab2
Initialize sentry after getting API status 2026-04-21 23:58:04 +00:00
839abcbd28
Move frontend data to base API root
Because many times we don't have a session
2026-04-21 23:53:42 +00:00
544ac78a3b
Add frontend configuration to session for env, sentry, version 2026-04-21 23:44:59 +00:00
8d37e8fab5
Fix compliance query
I can't use this until I fix some bugs in bob :(
2026-04-21 23:36:29 +00:00
2b30411c1b
Add sentry integration with Vue frontend 2026-04-21 23:35:59 +00:00
baaa3bff5b
Make request parser handle form-encoded content
This fixes a new signin bug
2026-04-21 22:48:31 +00:00
0ce3420792
Save all lob events to the database
They're pretty raw, but this will help us to understand what we can
collect
2026-04-21 22:24:12 +00:00
4db1a6f678
Add support for data fields for letter.created 2026-04-21 22:11:53 +00:00
f24104dc94
Update lob hook to handle both address created and letter billed payloads
Seems we'll have a lot of optional values
2026-04-21 22:00:09 +00:00
ee9a355613
Serialize nil slices as empty slices 2026-04-21 21:55:48 +00:00
810a13cee0
Add initial lob hook receiver 2026-04-21 21:55:37 +00:00
fe2041f22b
Add an evidence field to compliance reports
This allows us to show a page with information about what the district
is concerned about when asking the user to fill a report.
2026-04-21 21:35:40 +00:00
a0ac5c0674
Add debug logs around authentication
Trying to troubleshoot our redirection logic after signin
2026-04-21 19:40:00 +00:00
bcc5151116
Don't create compliance reports on root Compliance page
We're now doing that through our two entrypoint pages.
2026-04-21 19:39:18 +00:00
8fd86d478c
Update mailer page to show actual data 2026-04-21 19:38:46 +00:00
0b005c3e76
Add debug logs around exiting goroutines
I'm debugging our clean shutdown
2026-04-21 19:37:58 +00:00
4a214b099e
Disallow login or sessions from inactive users 2026-04-21 19:37:26 +00:00
eb27af7d90
Add mailer API and initial mailer view 2026-04-21 19:19:59 +00:00
0d8d7f3aeb
Add link for reviewing mailers 2026-04-21 15:01:46 +00:00
80031c1d1a
Make response to compliance report creation consistent 2026-04-21 15:01:01 +00:00
bcea3c6bdf
Gracefully exit listenForJobs when context ends 2026-04-21 14:59:52 +00:00
bad50a8772
Clean up compliance report creators and share UI 2026-04-21 14:45:11 +00:00
bd3e42f83e
Use the same create logic for Mailer report creation 2026-04-21 14:41:06 +00:00
f927b0a911
Split out ComplianceDistrict view for creating new compliance reports
The idea here is that we'll make compliance reports two different ways.
The first is if the user navigates to /district/:slug/compliance, the
second if they open a QR code from a mailer. In both cases we create the
report then feed them into a flow for updating the data on that report.
2026-04-21 14:35:13 +00:00
8eae73eefb
Add initial compliance mailer page
It loads at this point. Woot.
2026-04-20 23:16:57 +00:00
5d510915d2
Add version to frontend connection 2026-04-20 22:42:21 +00:00
e2d4f917a0
Add script for running output of "nix build" 2026-04-20 22:41:55 +00:00
2a3dbbdad3
Show login error on failure 2026-04-20 22:34:39 +00:00
7a6cffa74c
Cleanup unused variables 2026-04-20 22:34:24 +00:00
e929118349
Reduce log spam on user login error 2026-04-20 22:34:10 +00:00
aae0d1ed74
Log version on startup 2026-04-20 22:33:56 +00:00
8387cf667b
Add company filter to Lob list addresses
...even though I never made it actually work.
2026-04-20 22:33:20 +00:00
ffd424df12
Save the organization with the compliance report on creation
This avoids the problem of having to assign the compliance report later
when we get location data and image data.
2026-04-20 16:21:08 +00:00
0b32492fd6
Add version information to build output 2026-04-20 01:58:44 +00:00
ade629ecf5
Rework background jobs to make transactions much shorter
I ended up with minutes-long open transactions in the database in prod
which was causing outages. This is because I thought transactions were
basically free, which is a terrible thing to think. Instead we'll just
open them when we need them.
2026-04-17 22:53:23 +00:00
55cb4ca962
Track the site with the URL 2026-04-17 22:04:24 +00:00
efd6f59fca
Populate ComplianceReportRequest on site review page 2026-04-17 21:40:04 +00:00
a6ca30fdb1
Add application name to transaction
Trying to find what's getting locked
2026-04-17 21:27:30 +00:00
3196b73a80
fix setting up compliance map 2026-04-17 20:58:21 +00:00
5a865cc5e1
arcgis-go bump 2026-04-17 20:55:11 +00:00
abbe80b1f0
Don't fail to process background jobs because one failed 2026-04-17 20:51:24 +00:00
ac552be7e7
Send compliance report data with lead data 2026-04-17 20:51:07 +00:00
cedbb3372e
Try to capture more data on the failure to create address with Lob 2026-04-17 20:50:43 +00:00
0420b777c9
Remove chatty log 2026-04-17 20:50:30 +00:00
83bf3023de
Update vendor hash for arcgis-go update 2026-04-17 20:26:55 +00:00
0e777568fb
Add sublogging for job work for debugging 2026-04-17 20:25:25 +00:00
75e9d5a621
Bump arcgis-go version to 0.0.12 2026-04-17 20:25:06 +00:00
a2cdbc26bd
Allow signin with next parameter 2026-04-17 19:44:08 +00:00
be9065354d
Detect when we fail to get tile service 2026-04-17 19:43:57 +00:00
fa675f293d
Add initial work on getting compliance data for leads 2026-04-17 19:43:40 +00:00
b7d26d5ad7
Only log every route if we have VERBOSE enabled 2026-04-17 19:39:10 +00:00
21587493c0
Stop swamping the server on reboot 2026-04-17 18:36:05 +00:00
4625dd39d0
Default sort sites by created date 2026-04-17 18:25:04 +00:00
fd662721bb
Fix non-rolled-back transactions 2026-04-17 18:19:13 +00:00
4a8c0d2e60
defer rollback rather than guard returns
I'm trying to make sure we close transactions on the database
2026-04-17 18:00:26 +00:00
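The defer-rollback pattern works because Rollback after a successful Commit is a harmless no-op (the driver returns a "transaction done" sentinel), so one deferred call covers every early-return error path without per-return guards. A sketch with a fake transaction type standing in for the real driver:

```go
package main

import "fmt"

// fakeTx mimics commit/rollback semantics: Rollback after Commit does
// nothing, which is what makes the deferred call safe.
type fakeTx struct{ state string } // "", "committed", or "rolledback"

func (t *fakeTx) Commit() error {
	t.state = "committed"
	return nil
}

func (t *fakeTx) Rollback() error {
	if t.state == "" {
		t.state = "rolledback"
	}
	return nil // no-op if the transaction already finished
}

// doWork shows the pattern: one deferred Rollback right after Begin
// covers every return path, instead of guarding each early return.
func doWork(tx *fakeTx, fail bool) error {
	defer tx.Rollback()
	if fail {
		return fmt.Errorf("mid-transaction failure")
	}
	return tx.Commit()
}

func main() {
	ok, bad := &fakeTx{}, &fakeTx{}
	doWork(ok, false)
	doWork(bad, true)
	fmt.Println(ok.state, bad.state) // committed rolledback
}
```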
c938cb231e
Add org name and user name to dashboard 2026-04-17 17:51:02 +00:00
ba8c0016ac
Add signup page...again
Had it previously, but broke it for the single-page app migration.
2026-04-17 17:48:18 +00:00
b6e1bffd79
Add support for satellite tiles, with caching 2026-04-17 17:47:38 +00:00
61351dabf1
Update UI when a compliance letter is sent 2026-04-17 15:20:17 +00:00
efa01cffc2
Move session management into session store
Trying to get rid of the redirect to signin on any page refresh
2026-04-17 14:52:02 +00:00
bf156eaf7f
Slim down chrome requirements for the server 2026-04-17 14:51:24 +00:00
cb92b845e6
Show sites with leads more clearly 2026-04-17 03:10:08 +00:00
b85deb229f
Improve padding for multiple marker points 2026-04-17 03:07:16 +00:00
bff81eb6e3
Add basic lead type 2026-04-17 02:59:22 +00:00
617631063f
Add quick'n'dirty interface for leads and features 2026-04-17 02:59:01 +00:00
1f8e6b698f
Allow for a lot more sites, and for scrolling 2026-04-17 02:40:05 +00:00
09fe773987
Use the correct URL for generating pdfs 2026-04-17 02:31:57 +00:00
273e2b7b32
Ignore new lob test utils 2026-04-17 02:23:28 +00:00
dc3cce0b8a
Switch to multipart upload of PDF to lob, move backend to using that. 2026-04-17 02:22:05 +00:00
2c0aa980e7
Working creation of a letter
I had to use an address ID for 'to' and 'from' and then did really
weak-sauce inline HTML.
2026-04-17 01:00:16 +00:00
3ab0a00959
Working path to create addresses in lob 2026-04-17 00:23:34 +00:00
2ddf015a68
Skip pools without an address 2026-04-17 00:11:47 +00:00
b7eff027e7
Parse out lob's address format 2026-04-17 00:02:10 +00:00
fde9539191
Don't attempt to find the parcel if our address ID is nil
That's because the address doesn't have location data, so we can't find
a parcel.
2026-04-16 23:54:18 +00:00
6945b9f9ed
Drop to a single worker in the geocode pool
I'm sharing transactions incorrectly, and until I fix that I need
correctness, not speed.
2026-04-16 23:50:19 +00:00
5100c8f0be
Stop redirecting all loads to the dash page 2026-04-16 22:51:15 +00:00
1aba99f732
Remove chatty debug logs 2026-04-16 22:51:00 +00:00
97ec0667a5
Initial success talking to lob directly with my own client 2026-04-16 22:41:43 +00:00
c0935c848b
Default required fields to empty strings
So the insertion doesn't fail
2026-04-16 22:32:48 +00:00
a2271a2ce8
Add more debug info about user login 2026-04-16 22:08:28 +00:00
83c013785f
Fix swizzled email args 2026-04-16 21:58:41 +00:00
e464a9fcdb
Longer timeout on axios client
We're hitting the 10sec timeout trying to do login
2026-04-16 21:24:45 +00:00
59bf360937
Try to get more debug info on lob 420 error 2026-04-16 21:00:24 +00:00
a33056039a
Update vendor hash 2026-04-16 21:00:13 +00:00
f1890332ae
Get proper cached images from the tile server 2026-04-16 20:51:17 +00:00
b3a89d9c68
More protections for oauth being expired 2026-04-16 20:43:40 +00:00
a922196f20
Add mailer file utilities 2026-04-16 20:41:13 +00:00
89a2cb30e6
Save the type from the database on feature 2026-04-16 20:40:55 +00:00
1bc452bc09
Avoid crashing when getting Fieldseeker client 2026-04-16 20:40:37 +00:00
ee2254281c
Fix erasing feature locations 2026-04-16 20:39:30 +00:00
59755a0b42
ignore new tile-raster helper command 2026-04-16 20:38:19 +00:00
d03c12ffb6
Add working ability to get stadia tiles directly 2026-04-16 20:37:49 +00:00
163b0f9edc
Destroy the session on signout
Kill it with fire
2026-04-16 19:50:23 +00:00
a6f9396760
Add first draft of mailer integration
This adds a bunch of stuff, including setting the organization's Lob
sender address ID, inserting mailer/compliance_report relationships,
adding external id from Lob (or maybe some other provider) and
attempting to load up the pool feature for a site.
2026-04-16 19:49:18 +00:00
84da2bdc7d
Show full address on site page 2026-04-16 18:55:47 +00:00
ed6dde2f0a
Fix navigation after login 2026-04-16 18:50:34 +00:00
3379251ccb
Fix missing AddressAndMapLocator import for RMO water 2026-04-16 18:03:40 +00:00
7483a6a695
Redirect to signin on session failure 2026-04-16 17:30:54 +00:00
d047c460ed
Add missing files in last two commits 2026-04-16 17:15:44 +00:00
81e057b900
Add initial work for backgrounding mailer job 2026-04-16 17:15:20 +00:00
b6d1bd9ee2
Create sign-in and sign-out workflow in SPA 2026-04-16 17:14:57 +00:00
08a1b5b81d
Don't send www-authenticate to well-behaved browsers
Had to make a special case for EventSource on the browser via the
Accept header. This prevents the browser from popping up its own login
window so we can show them the nice login page.
2026-04-16 16:01:45 +00:00
7b95cfe833
Actually commit the transaction 2026-04-16 10:49:06 +00:00
6b90edf053
Fix creation of signal for mailer 2026-04-16 10:47:13 +00:00
f444bf39fb
Update hashes for deployment 2026-04-16 10:46:55 +00:00
2ea47f03f4
Start wiring together request for a mailer to database 2026-04-16 10:15:28 +00:00
74e24b7de3
Add feature to site data 2026-04-16 09:04:25 +00:00
5a35c1d1f8
Show parcel information on site page 2026-04-16 08:26:48 +00:00
5d06afbecc
Add maps for site review 2026-04-16 08:09:03 +00:00
8514ec36d5
Allow for selecting sites 2026-04-16 07:55:08 +00:00
35ab261ee8
Add missing site store 2026-04-16 07:43:53 +00:00
838e24bbed
Stop losing webGL context on review complete
It makes things *much* faster
2026-04-16 07:43:17 +00:00
84604dfdc8
Fix reference to site owner 2026-04-16 07:25:06 +00:00
d86ef13345
Flesh out the start of the site list 2026-04-16 07:20:53 +00:00
b9c257a635
Add site contact information 2026-04-16 07:12:34 +00:00
671397ba81
Make a home for reviewing sites 2026-04-16 07:03:45 +00:00
c4c22f6733
Start to populate site information in review task 2026-04-16 06:58:05 +00:00
03dccb638a
Start adding support for lob 2026-04-16 06:57:20 +00:00
e3f9a19b84
Allow deselecting review tasks
Makes it so I can test the map losing gl context
2026-04-16 05:36:54 +00:00
262aa009c2
Avoid losing map context when selecting tasks 2026-04-16 05:36:28 +00:00
74e0630a41
Revert proxied tile to not use layer control
It's disabling the important layer.
2026-04-16 05:14:27 +00:00
29c3b267d9
Add pieces of initial site review page 2026-04-16 04:48:07 +00:00
e1f3c93a1d
Make it possible to click on either map to choose a pool 2026-04-16 04:47:41 +00:00
259960cf45
Show row number in UI for pool uploads 2026-04-16 04:21:32 +00:00
d395699dc4
New algorithm for detecting multiple conflicting features 2026-04-16 04:21:11 +00:00
f490e4a1a4
Avoid duplicate rownumber calculation
We have the row number saved on the pool object itself, which is how we
gather errors against rows.
2026-04-16 04:20:28 +00:00
d352b0d932
Get pool rows by line number so they stay in order
Because otherwise the errors don't line up correctly.
2026-04-16 04:01:44 +00:00
a9077b6c36
Fix framing locations on the map display 2026-04-16 03:46:56 +00:00
5f68eb453f
Add placeholder for when we fail to extract address data from a feature
This bit me recently when getting the number from an address
2026-04-16 03:29:08 +00:00
5e0981e2a2
fix bad copy paste on address field 2026-04-16 03:16:57 +00:00
b2d8e3ba27
Move address list func to types so it can be shared with csv
And stop double-geocoding all the rows.
2026-04-16 03:06:18 +00:00
5bf93c3dfd
Fix erroneously showing error marking on good rows 2026-04-16 03:05:31 +00:00
b2c5bb6735
Show map on upload detail page
Bunch of stuff still doesn't work right.
2026-04-16 02:48:12 +00:00
171672ee33
Fix minor error on upload detail page rendering 2026-04-16 02:47:29 +00:00
aa5a35b15f
Create option to use satellite imagery
Useful for looking at pools
2026-04-16 02:47:11 +00:00
82dd5e8683
More debugging for CSV import 2026-04-16 02:46:55 +00:00
f5ac7bb4ee
Set address form pool rows using address model if possible 2026-04-16 02:46:24 +00:00
e894ae28dc
Add initial site list resource 2026-04-16 02:45:48 +00:00
d55a7ec5af
Add initial sort-of-working layer selector 2026-04-16 01:47:33 +00:00
0e165b57d0
Default header to tag type 2026-04-16 00:18:42 +00:00
dfe7d3650f
Default header type to unknown
This is a subtle bug from the zero value of a header enum that's causing
overwriting in pool uploads
2026-04-16 00:14:35 +00:00
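The zero-value pitfall described above is a common Go enum hazard. A minimal sketch, with illustrative names (`HeaderType`, `HeaderUnknown`, `HeaderTag`) that are assumptions, not the project's actual identifiers: making the `iota` zero value an explicit "unknown" keeps a never-assigned header from masquerading as a real type and overwriting data.

```go
package main

import "fmt"

// HeaderType classifies a CSV column header. Because the zero value is
// an explicit HeaderUnknown, a HeaderType that was never assigned
// cannot silently act as a meaningful type.
type HeaderType int

const (
	HeaderUnknown HeaderType = iota // zero value: safe default
	HeaderAddress
	HeaderTag
)

func (h HeaderType) String() string {
	switch h {
	case HeaderAddress:
		return "address"
	case HeaderTag:
		return "tag"
	default:
		return "unknown"
	}
}

func main() {
	var h HeaderType // never assigned
	fmt.Println(h)   // prints "unknown", not a real header type
}
```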
b6951d64d4
Remove api_key from URL to stadia cache
It's redundant and a security risk
2026-04-16 00:09:18 +00:00
ee38d0d2b6
Add fake operations map component 2026-04-15 23:45:05 +00:00
ac27c60e0c
Save address IDs when doing pool geocoding 2026-04-15 20:29:42 +00:00
6a8ae6d81a
Exit the geocode job if we hit an error 2026-04-15 19:31:55 +00:00
87c802fa90
Fix relationship for looking up whether the pool is in the district 2026-04-15 19:31:32 +00:00
b08582224a
Add missing required state header
I was incorrectly mapping "city" to "region" previously. A 'region'
actually is closer to a state. We need a locality, which is closer to a
city.
2026-04-15 19:30:50 +00:00
66d35428fa
Add error display to file upload 2026-04-15 19:02:25 +00:00
344f4bcaa5
Fix redirect after discard 2026-04-15 18:34:43 +00:00
ac65129ba6
Fix ability to discard upload 2026-04-15 18:32:28 +00:00
322be2fe40
Fix redirect on CSV upload 2026-04-15 18:32:19 +00:00
1097004245
Add custom pool upload page 2026-04-15 18:25:38 +00:00
f4d0ce015d
Factor upload requirements out into parent component 2026-04-15 17:36:33 +00:00
adcff5c5c8
Split upload requirements table into its own component 2026-04-15 17:27:22 +00:00
388801fd09
Fix upload links for pools 2026-04-15 17:24:34 +00:00
cd98751667
Avoid unloading the maps on pool review
It's expensive and slow
2026-04-15 17:23:26 +00:00
b0e2e97f09
Make changes actually reflect the changes 2026-04-15 16:59:53 +00:00
239340a7a9
Allow for selecting new location for a pool 2026-04-15 16:50:43 +00:00
31d8d2d0d5
Populate pool review form in the parent view
This will allow us to change it.
2026-04-15 16:42:16 +00:00
05ec6798ac
Get markers to show up on maps in pool review page 2026-04-15 16:22:08 +00:00
5549f9d79f
Wire up logic for completing and discarding a review 2026-04-15 14:11:48 +00:00
0fbf891c23
Show filler until a review task is selected 2026-04-15 14:07:43 +00:00
9ea99c92f9
Get overview map working on review details page 2026-04-15 00:12:19 +00:00
8ebcff7390
Fix oauth callback for arcgis to be under oauth prefix
That way it gets through the Vite proxy.
2026-04-14 23:43:53 +00:00
659df00cc9
Allow refreshing the oauth token in the frontend 2026-04-14 23:41:40 +00:00
5451c297c2
Harmonize review page properties between front and back ends 2026-04-14 23:29:29 +00:00
b09725726c
Create API for service requests list 2026-04-14 23:06:50 +00:00
4a440e3022
Add a resource for getting service requests 2026-04-14 19:59:32 +00:00
28ec1c3d67
Get latest syncs from the API 2026-04-14 19:21:51 +00:00
347e8dcb86
Update geocode store to use new naming pattern 2026-04-14 18:40:54 +00:00
67dcb87b81
Add missing water report detail component 2026-04-14 16:31:08 +00:00
b849eec7ea
Restore status page for standing water report 2026-04-14 16:28:58 +00:00
ebbd79ed7e
Fix status display of type-specific report details 2026-04-14 16:20:44 +00:00
fe41df3e16
Make publicreport by ID base redirect to detailed information 2026-04-14 16:07:17 +00:00
4a28a16639
Show address location, not reporter location, in the status page 2026-04-14 15:46:52 +00:00
59e58840c9
Fix address lat/lng location names, populate in report response 2026-04-14 15:43:49 +00:00
7e2a22c58c
Move report ByID to their own resources 2026-04-14 15:31:10 +00:00
9b1de15373
Fix public report ID for water reports 2026-04-14 15:28:41 +00:00
c84a2ef42b
Properly set public ID 2026-04-14 15:24:21 +00:00
5527731e83
Add can SMS question, fix error handling of client ID 2026-04-14 15:15:19 +00:00
84db38c985
Include client ID in nuisance and water reports 2026-04-14 14:50:28 +00:00
a23866619d
Start saving client ID on compliance reports 2026-04-14 14:38:22 +00:00
7545b2e4ef
Update the report after uploading images 2026-04-14 02:44:10 +00:00
5448702a7d
Fully populate report after PUT
Otherwise we miss stuff like the number of images
2026-04-14 02:38:36 +00:00
2408bcbeff
Save public_id after creating report so we see it in the UI 2026-04-14 02:35:41 +00:00
e707d91e7f
Create images on the correct URI 2026-04-14 02:35:04 +00:00
e2b6bc6502
Use JSON POST for creating the compliance report 2026-04-14 02:33:39 +00:00
b805374a6c
Fix not panning to location when map is loaded 2026-04-14 02:02:34 +00:00
6e1a5b4348
Avoid jumping to empty camera location 2026-04-14 01:45:04 +00:00
cadf6afb5f
Use embedded address location rather than external location on geocode 2026-04-14 01:42:53 +00:00
3c62fe2ca1
Be consistent about using report.public_id over report.id 2026-04-14 01:26:23 +00:00
02139450c6
Always set reporter phone can SMS 2026-04-14 01:21:50 +00:00
3ae72c8944
Check for address before inserting a new one. 2026-04-14 01:20:52 +00:00
a189348b36
Remove existing report URI when submission completes 2026-04-13 23:51:43 +00:00
8f494991e2
Make final pages show real data 2026-04-13 22:46:43 +00:00
f74d2c3ca1
Show phone and email as present if they are only on the server 2026-04-13 22:35:00 +00:00
c0389fa4b1
Stop overwriting the address by ID
We can pull this in the single query we do to the database instead
2026-04-13 22:34:36 +00:00
9c557a0391
Make it possible to save SMS support status on phone record 2026-04-13 22:23:29 +00:00
96878f24de
Get contact information to save in compliance flow 2026-04-13 21:45:29 +00:00
083c4ddae9
Save access information to database 2026-04-13 20:42:03 +00:00
ba76c8b1db
Return full compliance report on PUT 2026-04-13 19:32:22 +00:00
9bca15ae7e
Use the right URI for compliance reports 2026-04-13 19:25:54 +00:00
ffedae0373
Update address foreign key when updating the address 2026-04-13 19:22:41 +00:00
dcab2e1f8f
Fix failing to find matching address with publicreport 2026-04-13 17:19:20 +00:00
92f4282674
Track map loading, frame markers when map is loaded
This prevents missing the marker framing because we are still loading
the map.
2026-04-13 16:59:59 +00:00
8baa056fab
Return a compliance URI on creation 2026-04-13 16:59:34 +00:00
5011f4c137
Add raw address value to public report response 2026-04-13 16:59:20 +00:00
1a031f16bd
Update event types to include compliance reports 2026-04-13 16:59:11 +00:00
5db4c05544
Add proper compliance report type
Can't believe I missed this.
2026-04-13 16:42:29 +00:00
0f94292ab7
Fix zoom in when we load existing report data 2026-04-13 15:22:50 +00:00
b701771dfb
Remove district loading debug log 2026-04-13 15:17:57 +00:00
447ea18d95
Bit of type cleanup when debugging 2026-04-13 15:17:57 +00:00
756cc0d266
Add properties to update compliance permission access 2026-04-13 15:15:33 +00:00
0297114faf
Remove unnecessary null coalesce 2026-04-13 15:14:48 +00:00
dddeafe6cd
Fix query for address IDs 2026-04-13 15:13:59 +00:00
6f5b8f5575
Add implementation of insertAddresses 2026-04-12 19:42:37 +00:00
9ba99d5ceb
Remove now-empty report address fields
We'll instead create address rows and reference those
2026-04-12 18:33:41 +00:00
5306f8ba62
Populate nuisance and water public reports by ID 2026-04-12 18:02:42 +00:00
ae10e4fee8
Initial pattern for populating different report types 2026-04-12 17:53:25 +00:00
c8f74d3c26
Consistently use 'public' prefix for reports 2026-04-12 17:07:14 +00:00
a3c340f787
Split public report URIs by type
This allows us to have different signatures for the different types
2026-04-12 17:01:30 +00:00
875298fe88
Show a warning if they will replace the images on the report 2026-04-12 16:44:20 +00:00
ab47259534
Fix sending empty longitude when creating the initial report 2026-04-12 16:29:37 +00:00
60eb6b9bbf
Use class hierarchy for different report types. 2026-04-10 23:57:47 +00:00
4735734404
Helper functions for parsing geocode data 2026-04-10 22:34:34 +00:00
4060e7ddcd
Upload images on compliance report 2026-04-10 22:34:14 +00:00
730f40956f
Store addresses on every geocode 2026-04-10 22:32:40 +00:00
e04b86218d
Fix bad compliance report PUT
Avoid attempting to PUT the location when we don't have a report URI
2026-04-10 20:30:22 +00:00
12aedaf543
Update the address when provided on a report 2026-04-10 20:30:22 +00:00
bac55774f8
Switch address to contain an embedded location, start saving compliance 2026-04-10 16:59:29 +00:00
14c0d453e9
Add loading indicator when checking for previous report data 2026-04-10 15:38:31 +00:00
b23fc6edc5
Fix dodgy creation of compliance report in database 2026-04-10 15:38:05 +00:00
c48aebcb0b
Set initial camera based on location in compliance 2026-04-10 14:20:04 +00:00
97acdb0e2c
Prevent the lock button from floating over address suggestion 2026-04-10 13:47:00 +00:00
f969f262b8
Allow spaces at the end of address input
Otherwise you can't type, it sucks.
2026-04-10 13:44:58 +00:00
ae50a1abd8
Remove the remains of the old bundle paths 2026-04-10 02:59:55 +00:00
553b65556a
Begin work on saving compliance report 2026-04-10 00:56:51 +00:00
3ad95e1365
Bind contact info to compliance model 2026-04-09 22:48:49 +00:00
86ab67e70b
Bind access information to the compliance model 2026-04-09 22:42:47 +00:00
3bde7a9cac
Save image data on the compliance model 2026-04-09 22:33:45 +00:00
a4a9662c94
Make submit page read from model values 2026-04-09 22:29:42 +00:00
79a56c2d20
Don't show default locator value 2026-04-09 22:29:26 +00:00
d3662b8240
Preserve the locator model
This makes it possible to move back-and-forth in the compliance process
and still retain data.
2026-04-09 22:22:27 +00:00
dbc5db9727
Link up data to final page. 2026-04-09 20:55:30 +00:00
5b5a63114c
Make the permission property an enum 2026-04-09 19:55:08 +00:00
a6912929a7
Establish basic pattern for compliance data flow
We can get location, images, and string data. It's the trifecta.
2026-04-09 17:35:00 +00:00
8d6976c770
Actually go to the next step when we get the location 2026-04-09 17:24:50 +00:00
9dccd21cee
RMO frontend checkpoint
 * Create a new AddressAndMapLocator which abstracts out the behavior of
   selecting a location
 * Fix the overlay causing render errors on the MapLocator by getting
   rid of the overlay and just using a lock indicator
 * Fix MapLocator zooming in to the wrong place by not framing the
   markers
 * Remove Latlng from platform and just use Location with optional
   accuracy
 * Use nested types with form-encoded POST
 * Fix styles on water report page
2026-04-09 17:21:35 +00:00
cb9e5146bf
Fix display of report ID and status on the by-id page. 2026-04-09 13:48:48 +00:00
882636de8f
Add nuisance report detail to status by ID page 2026-04-09 01:15:13 +00:00
531f3282d9
Move bounds to API types 2026-04-09 01:02:25 +00:00
f88ca57d97
Migrate existing ts types from the API into the API module
This makes it possible to start hydrating the types into valid data
types like Dates which means I can get type safety guarantees when
displaying information.
2026-04-09 00:25:21 +00:00
b2c24a0438
Show nuisance report status 2026-04-08 23:37:00 +00:00
37ce3183ca
Add beginnings of status page 2026-04-08 22:54:20 +00:00
2c0bfb9904
Update nuisance submission to go to submitted page 2026-04-08 17:51:41 +00:00
c41154a200
Actually serialize errors on bad JSON POST response 2026-04-08 17:27:51 +00:00
7f90391ecd
Remove old generated JS/CSS paths
We generate a whole file now.
2026-04-08 15:01:02 +00:00
1a7a2b13aa
Navigate to report complete when report is submitted 2026-04-08 14:59:30 +00:00
6c79b8a85e
Add address GID to public report
This is _way_ better than trying to re-transmit structured address data
to the backend via strings
2026-04-08 14:40:27 +00:00
765b8fbef7
Better overlay logic for clicking on map controls 2026-04-08 14:25:47 +00:00
8e536d1d2f
Add map overlay for phone interactions 2026-04-08 14:11:30 +00:00
68315a3fb2
Fix button icons on complete page 2026-04-07 16:26:34 +00:00
6fb5a7f971
Fix icons on RMO, add phone data to district 2026-04-07 16:25:14 +00:00
cc7ce44f47
Finish porting styles of compliance flow 2026-04-07 16:05:30 +00:00
53bfbbc5ef
comment-out useHead 2026-04-07 16:05:04 +00:00
9601b88b41
Fix breaking the API routing
Ordering matters when you have a catch-all for the SPA
2026-04-07 15:42:48 +00:00
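The ordering lesson above (routers like gorilla/mux try routes in registration order, so an SPA catch-all registered first shadows every API route) can be shown with a tiny stdlib stand-in. This is an illustrative sketch mimicking first-match semantics, not the project's router code; the `route` and `match` names are assumptions.

```go
package main

import (
	"fmt"
	"strings"
)

// route pairs a path prefix with a name; first match wins, mirroring
// how gorilla/mux tries routes in the order they were registered.
type route struct {
	prefix string
	name   string
}

func match(routes []route, path string) string {
	for _, rt := range routes {
		if strings.HasPrefix(path, rt.prefix) {
			return rt.name
		}
	}
	return "404"
}

func main() {
	// Wrong order: the SPA catch-all shadows the API.
	bad := []route{{"/", "spa"}, {"/api/", "api"}}
	// Right order: specific API routes first, catch-all last.
	good := []route{{"/api/", "api"}, {"/", "spa"}}
	fmt.Println(match(bad, "/api/users"))  // spa — the API is unreachable
	fmt.Println(match(good, "/api/users")) // api
}
```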
f22ddd0405
Remove attempt at dynamically generating SPA content
We just do it with vite now.
2026-04-07 15:09:59 +00:00
1a53d5338f
Work out the rest of the static site deployment 2026-04-07 14:56:31 +00:00
e7c33d7e10
Get farther in producing a build 2026-04-07 02:54:48 +00:00
d52101d25b
Begin work on nix deployment logic for vite 2026-04-07 02:02:45 +00:00
6f677b5638
Add the full compliance mocks 2026-04-07 00:53:44 +00:00
4faa7fa8c0
Figure out router pattern for compliance steps 2026-04-07 00:04:40 +00:00
20614acb86
Add initial compliance intro page 2026-04-06 22:38:17 +00:00
c393f6fd81
Add cache for all stadia requests 2026-04-06 22:36:25 +00:00
9ef6aaa406
Remove direct calls to stadia API from geocoding 2026-04-06 16:59:19 +00:00
43dce16fbd
Add APIs for geocoding and reverse-geocoding 2026-04-06 16:59:18 +00:00
437f87013a
Add information on resetting password 2026-04-06 15:58:04 +00:00
380b41f695
Add ability to query for a place by gid(s) 2026-04-06 15:36:17 +00:00
2d5dca3fb5
Add proxied autocomplete for Stadia
This allows me to make the format consistent and to cache the
intermediate results, which is useful for speed and testing
2026-04-05 21:57:30 +00:00
b6cfbee102
Add geocoding logic/store 2026-04-05 03:47:22 +00:00
5681ff2283
Fix render crash 2026-04-05 03:09:10 +00:00
332e64c9ab
Add basic location store for getting geoposition 2026-04-04 02:32:09 +00:00
beb6d9d066
Better zoom to location on address selection 2026-04-03 23:11:39 +00:00
e56e83161b
Include address information on nuisance form upload 2026-04-03 23:04:04 +00:00
e08f614d11
Make the locator model a camera, not just a location
That means we can track zoom
2026-04-03 22:42:50 +00:00
10e368c403
Get initial nuisance and water resources working
This is a straight port of the form-encoded POST submission logic.

It is missing a bunch of data.
2026-04-03 22:04:22 +00:00
597aedc2af
Fix show more questions behavior 2026-04-03 20:46:02 +00:00
07e48aa071
Zoom when an address is provided or the map is clicked 2026-04-03 20:29:30 +00:00
c5c78a2b84
Add initial ImageUpload component 2026-04-03 20:15:02 +00:00
9104e2f7c3
Start map with default framing on nuisance page 2026-04-03 20:01:23 +00:00
27fd1faa9c
Get clean-building locator map 2026-04-03 19:45:12 +00:00
6203e3da75
Add nuisance style, fix header on non-home district pages 2026-04-03 19:07:20 +00:00
b6037d7525
Add address suggestion component 2026-04-03 19:02:20 +00:00
51fe851c5a
Add district-styled pages for all 3 main RMO pages 2026-04-03 18:50:23 +00:00
c0e414bdc3
Add status page to RMO 2026-04-03 18:29:29 +00:00
5842b6251d
Add district header on root page 2026-04-03 18:28:41 +00:00
bfecae7d61
Add district resource and an API to RMO
We're going to need an API for the single-page frontend
2026-04-03 18:17:19 +00:00
4f9617aa2f
Add RMO water page, start district layout 2026-04-03 16:37:09 +00:00
fd7607f5b7
Add nuisance view for RMO 2026-04-03 16:08:57 +00:00
64a8de7a32
Add custom icons to RMO system 2026-04-03 16:00:50 +00:00
4a9d6e0db6
Port root RMO with style to main page 2026-04-03 15:58:50 +00:00
4d718f9a12
Add router and basic home view 2026-04-03 15:43:44 +00:00
2342a99405
Save experiment in postgres integration 2026-04-03 15:25:15 +00:00
4f6369fa27
Fix static content for RMO 2026-04-03 15:24:19 +00:00
e8db3de122
Remove old index page 2026-04-03 15:19:54 +00:00
b658e28f2e
Get static content showing on sync 2026-04-03 15:15:47 +00:00
b919472f42
Make RMO and Sync run in parallel and use the same sources 2026-04-03 15:09:53 +00:00
d7d6888f63
Initial commit of some work creating RMO single-page app
Doesn't work yet, but they both start, so checkpoint.
2026-04-03 15:02:37 +00:00
54e77f72f4
Add completion page 2026-04-03 14:34:46 +00:00
3e0003095b
Add process and submission mocks 2026-04-03 03:57:43 +00:00
095ab828b6
Add contact page in compliance flow 2026-04-03 03:43:55 +00:00
06345099eb
Add permission mock in compliance flow 2026-04-03 03:41:26 +00:00
5cabea8577
Add evidence mock 2026-04-03 03:35:45 +00:00
42bcdb8af8
Add mock for concern page 2026-04-03 03:26:52 +00:00
bfe2b88622
Add compliance address mock 2026-04-03 03:22:09 +00:00
377683c4e3
Update compliance landing page
Show district logo, phone number, etc.
2026-04-03 00:18:15 +00:00
457f123f69
Add simple compliance landing page 2026-04-03 00:12:52 +00:00
4b87c74f41
Make impersonation ending work, fix frontend events 2026-04-02 21:31:31 +00:00
522c5785a2
Create button for ending impersonation 2026-04-02 19:36:49 +00:00
76c395d613
Add display in sidebar for impersonation 2026-04-02 17:39:16 +00:00
51811132a4
Add avatar display to user selector 2026-04-02 15:39:52 +00:00
ea231fb0cc
Avoid failure when cloning proxied object 2026-04-02 15:32:48 +00:00
9574ed4812
Fix warning on user edit component 2026-04-02 15:26:03 +00:00
af2299f417
Fix saving of tags on users 2026-04-02 15:25:51 +00:00
945b482b00
Fix saving of tags on users 2026-04-02 15:10:51 +00:00
f9934095b3
Fix reference to avatar URI 2026-04-02 15:09:59 +00:00
aa02d2e729
Fix being able to set the role 2026-04-02 14:30:07 +00:00
ee76dddf2f
Add some missing files from previous commits 2026-04-02 14:23:16 +00:00
fc56c1406a
Make it possible to change more user fields 2026-04-02 14:22:45 +00:00
7ee70b24ee
Fix user data displays 2026-04-02 14:03:07 +00:00
3745231f51
Structure PUT by using omit.Value 2026-04-02 13:28:18 +00:00
353a3ea442
Use the correct scheme for URIs 2026-04-02 01:18:25 +00:00
124d1b7078
Show the avatar on the user edit page 2026-04-02 01:11:51 +00:00
42d111aac9
Add separate session endpoint for additional non-user data
This is conceptually much cleaner than encumbering the user object.
2026-04-02 01:07:55 +00:00
00ebc27069
Add reverse parsing of a URI.
Yay. I did it. All the work is worth it now.
2026-04-01 22:01:31 +00:00
4145944b1b
Allow arbitrary responses from form-encoded POST
Useful for returning full objects
2026-04-01 21:34:17 +00:00
a89a4fbec5
Add avatar resource 2026-04-01 21:23:28 +00:00
0a7a2512d4
Properly set Avatar value to null 2026-04-01 20:35:00 +00:00
6fbde6389d
Start creating user resources without ID. 2026-04-01 20:22:15 +00:00
a656d45a6d
Move QueryParams to resource module 2026-04-01 18:23:43 +00:00
ab519020fc
Swap out the rest of chi
We're now chi-free.

Not bug-free.
2026-04-01 16:57:33 +00:00
6c311c76e3
Initial draft of shifting from chi to gorilla/mux 2026-04-01 16:19:11 +00:00
5172400803
Begin switch to gorilla/mux
I'm realizing with this code that I'm going to have a problem if I want
to do HATEOAS-style APIs. chi just doesn't do resource-oriented API
design, and I'd have to build a lot of stuff myself.

I'm in the middle of swapping out the UI. Now is the time to make the
switch.
2026-04-01 15:32:27 +00:00
c253e655b1
Add avatar placeholder when avatar is empty 2026-04-01 14:48:31 +00:00
0ecf9c1be1
Populate user selector 2026-03-31 23:34:03 +00:00
05a7bbb4e3
Show empty aggregation map without service area 2026-03-31 23:29:37 +00:00
7f8491a1c2
Add test mailer 2 2026-03-31 20:05:35 +00:00
7f72e82ceb
Fix favicon on sync 2026-03-31 19:18:33 +00:00
af136f324d
Break sudo page into components
Makes it easier to fix the overall layout, which I've done.
2026-03-31 17:34:37 +00:00
4f96f35d9a
Add mode 1 mailer for testing. 2026-03-31 17:34:08 +00:00
7b3c1f2b54
Add initial implementation of user selector on sudo 2026-03-31 15:10:32 +00:00
21b7b68f50
Get new frontend to type check clean
Epic undertaking.
2026-03-31 14:52:53 +00:00
6f9a511874
WIP of user avatar work
Switching from laptop
2026-03-29 17:09:01 -07:00
ad90f9c95e
Create API for adding an avatar to a user 2026-03-28 18:55:13 -07:00
da7549eeda
Show actual user data on the edit page. 2026-03-28 18:06:14 -07:00
92ed974e4b
Use overlay buttons to change avatar 2026-03-28 17:23:09 -07:00
15371ec064
Add basic user edit page 2026-03-28 16:31:29 -07:00
4bfaaa72ce
Add URI to user resource 2026-03-28 15:13:11 -07:00
e59794f5e0
Query for users to populate the users page 2026-03-28 14:45:49 -07:00
1f9f1ae166
Fix router links on configuration page 2026-03-28 13:02:04 -07:00
9a9371301c
Get review detail UI to show without crashing
It doesn't fully work yet though.
2026-03-28 12:35:12 -07:00
9921618c12
Get to where we can display something on pool review 2026-03-28 09:14:09 -07:00
33399b5e2a
Fix links on review page 2026-03-28 06:50:25 -07:00
da14410fc7
Update sidebar links to new format 2026-03-28 06:47:20 -07:00
6f8c012394
fix filename and status displays 2026-03-28 06:45:03 -07:00
daf921accf
Fix counts of upload rows 2026-03-28 06:42:36 -07:00
c67afa7e1e
Fix link to upload detail page 2026-03-28 06:42:17 -07:00
2e0f657585
Fix upload list 2026-03-28 06:38:23 -07:00
7699e58bc3
Make upload by ID styles work correctly 2026-03-27 14:17:24 -07:00
4bbfbdb9e6
Pretty all the things I missed
My laptop didn't have lefthook running. Oops.
2026-03-27 14:06:50 -07:00
f60bde7fd9
Get rows to show on individual upload page. 2026-03-27 14:04:33 -07:00
1ad3c5a5c8
On upload redirect to upload detail page 2026-03-27 11:33:21 -07:00
747544bb58
Get file upload working
Even though the UI doesn't do anything with it yet.
2026-03-27 08:39:38 -07:00
0d1bd752a4
Fix flyover data upload link 2026-03-27 06:42:50 -07:00
88d88ec8d3
Fix link to upload pool 2026-03-27 06:40:43 -07:00
670310de15
Add upload style 2026-03-27 06:38:53 -07:00
df8cab4b07
Add style for configuration page 2026-03-27 06:36:41 -07:00
aee9bb9267
Allow a click on an unselected item to immediately select 2026-03-27 06:25:43 -07:00
d7c07fc65f
Move all POST endpoints to the API 2026-03-27 06:08:55 -07:00
3ff7ff05ab
Remove clear selection button 2026-03-25 21:54:06 -07:00
bf2a7582fa
Get some planning buttons wired up 2026-03-25 21:46:23 -07:00
ef412b28ec
Make upload GET an API request 2026-03-25 21:46:23 -07:00
2a92420bbe
Add flyover upload page 2026-03-24 13:50:44 -07:00
ba89b2e994
Add flyover pool upload 2026-03-24 13:48:13 -07:00
0718d88f7a
Add configuration upload page 2026-03-24 13:45:37 -07:00
a64df8a687
Fix report ID, get to where organization ID is passed through correctly 2026-03-24 11:07:48 -07:00
f33020e2b8
Add flyover card 2026-03-24 09:51:05 -07:00
0318b332bb
Fix viewing photo details 2026-03-24 09:50:15 -07:00
09ae9d0ce3
Move map interfaces to common types for sharing 2026-03-24 09:37:05 -07:00
6fe107601e
Handle address data more gracefully
Helps avoid embarrassment at the conference
2026-03-24 09:06:42 -07:00
69eabe4e85
Use publicreport card component on planning page 2026-03-24 09:06:42 -07:00
0289bf5756
Fix retrieval of reports by ID 2026-03-24 09:06:42 -07:00
6f45325d9d
Separate publicreport display into common UI component 2026-03-24 09:06:42 -07:00
d5cf65f4cb
Add displays for public reports on the planning page 2026-03-24 09:06:42 -07:00
fb853a2bd3
Add ability to select items and display in detail view 2026-03-24 09:06:42 -07:00
e0a586b311
Clean up display of planning signal entries 2026-03-24 09:06:42 -07:00
55c8b8f1dd
Show a decent title for signals 2026-03-24 09:06:42 -07:00
8594723a0d
Clear the error message when the user requests a refresh 2026-03-24 09:06:42 -07:00
761af13270
coalesce to a valid country value 2026-03-24 09:06:42 -07:00
7f756ce8ca
Make refresh button on planning work
And experiment with separating the list entries into a separate
component
2026-03-24 09:06:42 -07:00
21a8f3029e
Remove on() from proxied arcgis map, catch errors 2026-03-24 09:06:42 -07:00
4f92fdced8
Show an error if the map fails to load 2026-03-24 09:06:42 -07:00
7360a9d2e1
Don't crash if the signal has no address 2026-03-24 09:06:42 -07:00
b081dcf6d5
Check auth off of our API client 2026-03-24 09:06:42 -07:00
2856587aca
Remove warnings about importing defines that are macros 2026-03-24 09:06:42 -07:00
ea318af65f
Start work on the signin page 2026-03-24 09:06:42 -07:00
da62cc8f98
Add axios
We'll use it to make TypeScript-enabled API requests
2026-03-24 09:06:42 -07:00
4a835d3f16
ignore color function errors
They're from Bootstrap, they don't really matter
2026-03-24 09:06:42 -07:00
84102dd50e
Disable scss deprecation warnings 2026-03-24 09:06:42 -07:00
32f00afa8a
Fix creation of new user organizations 2026-03-24 09:06:42 -07:00
3ae88f984a
Add a .env file
This allows customizations on different machines, like my laptop vs my
desktop.
2026-03-24 09:06:42 -07:00
96237c7599
Allow for disabling OpenAI integration
For offline development
2026-03-24 09:06:42 -07:00
f90faa4732
Set the database when importing districts 2026-03-24 09:06:42 -07:00
a7fe9ee6d9
Add commands for creating the tegola user 2026-03-24 09:06:42 -07:00
9eb7022336
Provide the raw address value for public reports 2026-03-24 05:53:05 +00:00
0a79c5d945
Get dev system running on different ports
So we go Caddy to Vite to nidus-sync
2026-03-23 00:34:21 +00:00
b384252c7c
Add placeholder gen file
Without it we can't compile Go.
2026-03-22 22:38:47 +00:00
47f900ab76
Switch from esbuild to vite
It just works better for debugging with VueJS
2026-03-22 22:36:43 +00:00
50643698c2
Try harder to get source maps in Vue
It's not working.
2026-03-22 19:47:04 +00:00
8d5fb1ef0b
Get map markers working on communication page 2026-03-22 19:30:11 +00:00
354c07f2bf
Fix TypeScript errors from recent changes 2026-03-22 18:27:13 +00:00
11f56bfd1c
Allow for deselecting communications 2026-03-22 18:25:02 +00:00
a4a8bcdaa9
Add ArcGIS oauth refresh page 2026-03-22 18:00:30 +00:00
35fc57e8d1
Add ArcGIS integration page 2026-03-22 17:49:59 +00:00
a410bf441c
Add integration configuration 2026-03-22 17:44:59 +00:00
71ffa13167
Add organization and pesticide configuration pages 2026-03-22 17:40:40 +00:00
bcde604690
Add pesticide configuration page 2026-03-22 17:25:11 +00:00
29d98796fb
Add common formatting functions 2026-03-22 17:21:01 +00:00
de0fbd7188
Add more configuration pages 2026-03-22 17:20:46 +00:00
da8074d1e0
Make selectedSignals a set, add some emitted actions 2026-03-22 17:19:58 +00:00
b671133f88
Move more stuff to root rendering 2026-03-22 17:19:05 +00:00
6797dfa251
Harmonize style between comms and planning panes 2026-03-22 10:14:48 +00:00
b152cf9c36
Break apart the planning columns 2026-03-22 09:58:25 +00:00
0b8bea393e
Fix updates to notification counts 2026-03-22 08:04:28 +00:00
674801c8b2
Fix subscription in the store
We are back to having instant data
2026-03-22 07:57:55 +00:00
6cd5821e5f
Fix hiding the image modal 2026-03-22 07:20:11 +00:00
d4165ec2d0
Fix display of image modal, start work on fixing marking things 2026-03-22 07:16:42 +00:00
82ecf0f5d1
Add URL for sending message to the list of URLs we give out 2026-03-22 07:06:50 +00:00
9c56f148e4
Fix a bunch of styles on communications page 2026-03-22 06:40:31 +00:00
b68332afc0
Rip apart communications page into separate columns
I broke a bunch of stuff, but it'll be worth it, promise.
2026-03-22 06:36:01 +00:00
22c2df11f8
Fix the ability to mark communications signal/noise 2026-03-22 04:53:50 +00:00
978c20d72a
Fix some of the RMO pages I broke by removing SVGs 2026-03-22 04:53:23 +00:00
bacfe7218f
Fix address display 2026-03-22 04:27:49 +00:00
c73a1123d2
Actually do more stuff when we select a communication 2026-03-22 04:26:53 +00:00
7dd61a06e2
Get to where I can select communications and see them 2026-03-22 04:08:16 +00:00
5f54cfa6ed
Get a callback to fire on click. 2026-03-22 03:56:52 +00:00
ac6cd878af
Get to where the comms page at least loads
Still got some warnings, still lots broken
2026-03-22 03:33:52 +00:00
821647cef1
Actually fetch communication from the store 2026-03-22 03:03:21 +00:00
03301518f0
TypeScript checking is clean.
Tons and tons of broken functionality. Now the crawl begins.
2026-03-22 02:55:17 +00:00
d9a98e9eb2
Begin ripping apart the communications page into components
Essential to get the logic under control
2026-03-22 02:37:10 +00:00
ef552af054
remove Alpine and start fixing type errors 2026-03-22 02:36:57 +00:00
46edbbae74
Add Communication API to user URLs
We don't want to build URLs anywhere but in the server.
2026-03-22 01:33:14 +00:00
31a9490210
Get required data for communications page from user store
Which gets it from the API of course
2026-03-22 01:23:08 +00:00
21180816be
Start providing organization info and URLs is user/self
The new frontend needs it to do its work.
2026-03-22 01:22:44 +00:00
736c71eefc
Start adding other views and our initial stores 2026-03-22 00:55:48 +00:00
c75c5446f7
Add barely-compiling views for the rest of the sidebar
No way these things actually work.
2026-03-22 00:22:16 +00:00
6422609150
Set up dashboard page through VueJS 2026-03-21 23:44:14 +00:00
bf3204992e
Enable sourcemaps for debugging 2026-03-21 23:24:06 +00:00
6d6fe9e1d6
Move Intelligence file to Vue logic 2026-03-21 22:41:47 +00:00
eaeedd5356
Use common navigation code between sidebar links 2026-03-21 22:18:01 +00:00
34d14846a1
Fix main content window to render correctly with sidebar 2026-03-21 21:59:44 +00:00
d367166e77
Add vue-router for handling routing to components 2026-03-21 21:58:02 +00:00
dba1468e4d
Improve build watch plugin
Makes it much easier to see what's going on.
2026-03-21 21:44:10 +00:00
e5af41b703
Re-create dynamic nature of the sidebar 2026-03-21 21:35:32 +00:00
48d44487da
Fill out the rest of the sidebar's icons 2026-03-21 21:31:30 +00:00
1bd0adbc50
Move SVGs into the frontend build pipeline
That way it can be used in the VueJS frontend directly
2026-03-21 21:27:50 +00:00
9b8c079d79
Start sorting out basic layout elements 2026-03-21 21:06:10 +00:00
efece7733f
Migrate root of application to use a basic Vue app
We'll build from here.
2026-03-21 20:48:21 +00:00
5779242f22
Prettier everything, remove vendored bootstrap
These are installed now via pnpm
2026-03-21 19:41:51 +00:00
004a49c4e4
Update prettier to format the new file types. 2026-03-21 19:39:30 +00:00
80f4f51b02
Add bootstrap-icons, make sidebar work with bundle logic
I'm starting to get a sense how to do all of this with these new tools.
I've semi-ported the sidebar at this point.
2026-03-21 19:34:23 +00:00
f3c818a48f
Add CSS via SCSS to the frontend build pipeline 2026-03-21 19:14:51 +00:00
1e67c0090d
Show how to add a map view through typescript 2026-03-21 18:13:40 +00:00
0126d24242
Switch to using single-file components (SFC) in Vue 2026-03-21 17:51:25 +00:00
ccdb391ccc
Get VueJS working in a sample project 2026-03-21 17:45:36 +00:00
228f4a6db9
Don't group js files with images in cache control
That's because we want the bundle to be super-cached and immutable.
2026-03-21 15:25:18 +00:00
5d8314d13b
Ignore different temp directory 2026-03-21 15:10:28 +00:00
303b4b826b
Fix export of SSEManager for SSE connection 2026-03-21 15:08:37 +00:00
cee76ddd53
Add bootstrap to the main application bundle 2026-03-21 05:47:39 +00:00
a2c3f52ab4
Fix embedded static files on production builds 2026-03-21 05:38:42 +00:00
9cbce4ff14
Fix nix build
Apparently tabs are bad.
2026-03-21 04:44:13 +00:00
0d6a6fa797
Include bootstrap and bootstrap icons in a single style bundle 2026-03-21 03:33:11 +00:00
16499fe23e
Don't error out just because we don't have a main map. 2026-03-21 03:11:22 +00:00
31947c848a
Move static outside HTML. Start work on TypeScript bundle
It's not strictly HTML, so that's just correct.

This is just worth doing while building the new TypeScript bundle
2026-03-21 03:06:59 +00:00
976a29b7d7
Create a working sample of an AlpineJS hello world 2026-03-21 02:04:11 +00:00
701f4853b5
Create a tiny working TypeScript example page 2026-03-21 01:42:22 +00:00
9b6cacda0e
Make signals include the object they are attached to (pool, report)
This means pushing the types into the common types module, which
required a refactor of a bunch of other libraries.
2026-03-21 01:19:36 +00:00
ddc63bfa91
Show pool map in planning workbench when signal is selected 2026-03-20 22:47:03 +00:00
931ea00e22
Add entry for displaying flyover pool signal 2026-03-20 22:12:53 +00:00
2cdcbb3784
When pools are green or murky, immediately create signals from them. 2026-03-20 22:11:36 +0000
c2c1f3377a
Remove signal_pool from tegola grant
That table is no more.
2026-03-20 21:16:42 +00:00
c4359a3c81
Fix signals getting saved with correct location 2026-03-20 20:37:16 +00:00
e86cdc6764
Fix status display for RMO 2026-03-20 20:25:58 +00:00
b034fa5cf5
Fix signal lines showing the correct type 2026-03-20 19:27:56 +00:00
8c6bb7db26
Show markers on the signal map and bound them 2026-03-20 19:20:11 +00:00
94400aa808
Remove the arcgis tile map from planning/signal selection 2026-03-20 19:11:02 +00:00
23fdfc5a98
Add comments and owner info from water reports 2026-03-20 19:07:10 +00:00
2f8f579430
Fix display of water access and breeding data 2026-03-20 18:54:37 +00:00
c392029a11
Fix alpine access errors
Turns out I misunderstood how x-data and x-if work together
2026-03-20 18:52:21 +00:00
bf5c49378b
Show reporter ownership information for water reports 2026-03-20 18:38:57 +00:00
557caef8e5
Fix not null error on nuisance reports 2026-03-20 18:30:02 +00:00
067465ab35
Expect only one format from our API
What are we, LLMs?
2026-03-20 18:29:29 +00:00
3c26ebdaf2
Shorten references to nuisance data 2026-03-20 18:13:43 +00:00
daf5aa316f
Add display for nuisance source data 2026-03-20 18:09:27 +00:00
aa94cce2ad
Fix creation of signal from a communication report
I had broken this when altering the signal model to always require a
location
2026-03-20 18:03:32 +00:00
f09533c742
Auto-reload signal data when a new signal is created 2026-03-20 18:03:05 +00:00
e88f40793f
Add duration information for nuisance reports 2026-03-20 18:01:52 +00:00
76bfc09aa5
fix display of nuisance properties 2026-03-20 17:59:40 +00:00
edfd8e285f
Add location x and y to address table
For easier reference
2026-03-20 17:59:13 +00:00
441e4d45b1
Add parcel overlay to raster tile map
Makes it easier to tell what parcel we're talking about.
2026-03-20 17:07:31 +00:00
313dacd956
Remove chatty debug logs 2026-03-20 16:38:01 +00:00
9d2b757bc7
Add pools with condition popup to review map 2026-03-20 16:37:46 +00:00
c9802b78d0
Fix double-showing of distance 2026-03-20 15:50:18 +00:00
6fcaf7fb5d
Avoid crashing when oauth is null 2026-03-20 15:47:45 +00:00
9ca8ec4ce2
Handle null image location in communication page 2026-03-20 15:45:55 +00:00
29e66327ee
Stop adding users to organizations based on Arcgis Account 2026-03-20 06:04:30 +00:00
a87904f2ff
Handle photo data including NaN for location 2026-03-20 05:48:59 +00:00
42d9d2372d
Add initial user selector for impersonation page 2026-03-20 05:20:37 +00:00
68e0da1133
Add log message when we can't marshal JSON
Been seeing this in prod
2026-03-20 05:01:52 +00:00
cb34c43ef4
Improve error messages on notify failures 2026-03-19 21:29:55 +00:00
6042e7d337
Emit events on note creation 2026-03-19 21:29:55 +00:00
31a767c944
Improve capture of shutdown error 2026-03-19 21:20:20 +00:00
87961bac58
Move audio API to its own file
More consistent organization
2026-03-19 21:07:00 +00:00
c7c1c45008
Add location to signal 2026-03-19 20:49:53 +00:00
fdab54a775
Fix saving note images and transcoding 2026-03-19 20:49:17 +00:00
ba03bf9d4f
Fix audio transcode copy-paste error 2026-03-19 20:14:23 +00:00
17fb3dcdb5
Fix saving notes from Nidus
Wow, that's a serious break.
2026-03-19 20:13:53 +00:00
f2b7d30a7f
Unselect report after it's removed from the list 2026-03-19 19:32:06 +00:00
429b724cf2
Emit a created event on signal creation 2026-03-19 19:17:00 +00:00
2c4e7c4f96
Handle nuisance reports without location data 2026-03-19 19:16:39 +00:00
7a111ab9b3
Show a notification when a report is marked as a signal 2026-03-19 19:07:48 +00:00
2f1b612e9e
Move signal creation inside platform layer
This allows us to emit events with it.
2026-03-19 19:00:44 +00:00
a5b8a333d6
Fix references to service area centroid in map creation 2026-03-19 18:08:41 +00:00
908ac4faea
Make signals, not leads, from public reports. 2026-03-19 17:41:56 +00:00
2a207fd613
Fix updating notification counts on events 2026-03-19 17:22:58 +00:00
ee61b6d24b
Move review actions into the platform, emit events on change
Still not seeing updates in the sidebar, however.
2026-03-19 16:55:49 +00:00
954a4330ee
Add notifications for review tasks 2026-03-19 16:01:44 +00:00
786a6c16a3
Fix up upload by ID
Show the street number as well as the rest of the address, emit an event
when the upload is processed, actually check if pools are existing, etc.
2026-03-19 15:31:04 +00:00
97c9269215
Update file status to committed when commit completes 2026-03-19 05:53:43 +00:00
0cc0b57e33
Fix display of 'committing' files 2026-03-19 05:46:06 +00:00
dad867a356
Fix readme example use of watchexec 2026-03-19 05:45:54 +00:00
ab5840dd54
Fix references to org ID using platform org
I broke these a while ago and didn't realize because the compiler
doesn't catch them.
2026-03-19 03:57:38 +00:00
5fd85d7052
Show log when upload file is committed 2026-03-19 03:53:15 +00:00
544f99c09b
Check error state on update sql 2026-03-19 03:52:56 +00:00
6338d9f3f3
Remove chatty debug log 2026-03-19 03:52:40 +00:00
45643e8369
fix redundant log message 2026-03-19 03:52:28 +00:00
c872cebb8f
Make pool condition colors more distinctive 2026-03-19 03:42:30 +00:00
5fa4dd2884
Fix error about redundant service area 2026-03-19 03:42:14 +00:00
c039c70e3e
Switch file upload page to not use map-libre-test
That libre test was something I built when doing the changeover to
stadia maps. It's now pretty well baked, so it's better to just use it.
2026-03-19 03:30:01 +00:00
434746aa99
Allow the catch-all district to do uploads 2026-03-19 03:25:36 +00:00
2f61b224de
Prevent creating CSV uploads without a service area 2026-03-19 03:19:58 +00:00
f2ea1367e2
Allow the transaction to commit on failure in CSV processing 2026-03-19 03:19:17 +00:00
d287fa44df
Create a log for impersonation activities 2026-03-19 03:19:03 +00:00
b2eb98a66c
Fix upload list page 2026-03-19 02:51:09 +00:00
732b123342
Auto-generate report IDs, join on public_id. 2026-03-18 20:36:58 +00:00
f66d40f28b
fix bad select in migration 112 2026-03-18 20:23:57 +00:00
1001 changed files with 69493 additions and 35751 deletions

22
.gitignore vendored
View file

@@ -1,14 +1,26 @@
.env
.sass-cache/
cmd/geocode-test/geocode-test
cmd/passwordgen/passwordgen
/db/jet/jet
districts/
flogo.log
html/static/css/bootstrap.css
html/static/css/bootstrap.css.map
nidus-sync
nidus-sync.log
lob/cmd/letter-create/letter-create
lob/cmd/letter-list/letter-list
lob/cmd/address-create/address-create
lob/cmd/address-list/address-list
/nidus-sync
/nidus-sync.log
node_modules/
postgrid/cmd/send-pdf/send-pdf
result
stadia/cmd/bulk-geocode/bulk-geocode
stadia/cmd/geocode-autocomplete/geocode-autocomplete
stadia/cmd/geocode-bygid/geocode-bygid
stadia/cmd/reverse-geocode/reverse-geocode
stadia/cmd/structured-geocode/structured-geocode
tmp/
stadia/cmd/tile-raster/tile-raster
static/gen/
temp/
ts/gen
vite/*/.vite/

View file

@@ -1,17 +1,11 @@
{
"plugins": ["/nix/store/6kfm5qrd2bckffxphb5ylvbg3sz1657r-prettier-plugin-go-template-0.0.15-unstable-2023-07-26/lib/node_modules/prettier-plugin-go-template/lib/index.js"],
"useTabs": true,
"overrides": [
{
"files": ["*.html"],
"options": {
"parser": "go-template",
"useTabs": true,
},
},
{
"files": ["*.js"],
"options": {
"useTabs": true,
},
},
],

303
CLEANUP.md Normal file
View file

@@ -0,0 +1,303 @@
# nidus-sync — Cleanup Tasks
This file lists code, files, and patterns that are remnants of older architectural approaches. These should be removed to reduce complexity, maintenance burden, and confusion.
---
## 1. Bob → Jet Migration (Incomplete)
**Status:** Bob is still the primary ORM. Jet was introduced May 2026 but only covers 3 schemas partially.
### 1a. Port remaining schemas from Bob to Jet
Jet-based queries exist for:
- `db/query/public/` — address, communication, communication_log_entry, compliance_report_request, feature, feature_pool, job, lead, signal, site
- `db/query/publicreport/` — compliance, image, image_exif, nuisance, report, report_image, report_log, water
- `db/query/arcgis/` — account, oauth, service_feature, service_map, user, user_privileges
Still using Bob directly (not yet ported to Jet queries):
- `platform/report/notification.go` (13 bob references)
- `platform/background/background.go` (8)
- `platform/arcgis.go` (8)
- `platform/text/send.go` (7)
- `platform/report/some_report.go` (6)
- `platform/site.go` (5)
- `platform/csv/flyover.go` (7)
- `platform/csv/pool.go` (5)
- `platform/csv/csv.go` (4)
- `platform/text/report.go` (4)
- `platform/text/phone_number.go` (3)
- `platform/publicreport/log.go` (3)
- `platform/mailer.go` (3)
- `platform/email/template.go` (2)
- `db/connection.go` (4 — bob.Tx types)
- `db/prepared.go` (2)
- `resource/review_task.go` (2)
- `rmo/status.go` (2)
- `rmo/report.go` (1)
- `rmo/mailer.go` (1)
- Plus many api/* files
### 1b. Remove Bob-generated models after migration
Once all queries are ported to Jet, delete the 103 `.bob.go` files in `db/models/`:
```
db/models/*.bob.go
```
### 1c. Remove Bob-specific helper files
These are Bob-specific and can be removed once Bob is fully replaced:
- `db/dberrors/` — Bob error types (still referenced)
- `db/dbinfo/` — Bob type info (still referenced)
- `db/models/bob_loaders.bob.go`
- `db/models/bob_where.bob.go`
### 1d. Remove Bob from go.mod and dependencies
After all Bob code is gone:
- Remove `github.com/Gleipnir-Technology/bob` from `go.mod`
- Run `go mod tidy`
### 1e. Remove Bob codegen scripts
- `db/bobgen.sh`
- `db/bobgen.yaml`
### 1f. Regenerate Jet output
The `db/jet/main.go` generator outputs to `db/gen/` but no output is currently checked in. Run the generator and ensure generated code is usable:
```bash
cd db/jet && go run .
```
---
## 2. Go HTML Templates → Vue SPA (Mostly Complete)
**Status:** Nearly all Go template routes are commented out in `sync/routes.go` and `rmo/routes.go`. Both hosts serve Vue SPAs via `static.SinglePageApp()`. Some Go template routes remain active.
### 2a. Remaining active Go template routes (sync)
These routes in `sync/routes.go` still render Go templates:
- `/oauth/arcgis/begin` → `getArcgisOauthBegin` (redirect, no template but in Go)
- `/oauth/arcgis/callback` → `getArcgisOauthCallback`
- `/mailer/pool/random` → `getMailerPoolRandom`
- `/mailer/mode-1` → `getMailer1` (generates PDF)
- `/mailer/mode-2` → `getMailer2` (generates PDF)
- `/mailer/mode-3/{code}` → `getMailer3` (generates PDF)
- `/mailer/mode-1/preview` → `getMailer1Preview`
- `/mailer/mode-2/preview` → `getMailer2Preview`
- `/mailer/mode-3/{code}/preview` → `getMailer3Preview`
- `/privacy` → `getPrivacy`
The mailer routes use `platform/pdf` which in turn uses headless Chrome (`chromedp`) to render HTML to PDF. This is legitimate server-side functionality, not just a template remnant. However, the PDF templates themselves may be candidates for migration to the Vue ecosystem.
### 2b. Remove all commented-out routes
Both `sync/routes.go` and `rmo/routes.go` have large blocks of commented-out route registrations. Remove these once migration is confirmed complete.
### 2c. Remove unused Go template files
Once all routes are ported or confirmed dead, remove the entire `html/template/` directory. The `html/` package (`html/embed.go`, `html/filesystem.go`, `html/func.go`, etc.) should also be removed once nothing references it.
### 2d. Reduce the html/ package surface
**Note:** The `html/` package is still actively imported by 40+ Go files. It provides:
- Template rendering (`html/embed.go`, `html/filesystem.go`) — mostly for mailer PDFs and privacy page
- `html.ContentConfig` — used extensively in sync/routes (mailer previews, admin pages)
- `html.MakeGet`, `html.MakePost` — HTTP handler wrappers (used by active `sync/` routes)
- `html.RespondError` — HTTP error responses
- Form parsing, image upload handling, URL building
**Short-term:** Remove the template rendering portion once mailer PDFs and privacy page are migrated.
**Long-term:** The full `html/` package can be removed only after all server-rendered pages are gone and handler wrappers are replaced with the `resource/` pattern.
---
## 3. esbuild (`build.js`) — Removed ✅
*(Completed 2026-05-09: `build.js` removed and `pkgs.esbuild` dropped from flake.nix devShell — Vite is the build tool)*
---
## 4. Legacy Static JavaScript Files
**Status:** `static/js/` contains 21 plain JavaScript files written as custom HTML elements and standalone scripts for the Go template era. These are referenced by old Go HTML templates but most of those templates are now unused.
### 4a. Files in static/js/
```
address-display.js
address-or-report-suggestion.js
address-suggestion.js
events.js
geocode.js
location.js
map-admin.js
map-aggregate.js
map-arcgis-tile.js
map-cell.js
map-locator.js
map-locator-ro.js
map-multipoint.js
map-proxied-arcgis-tile.js
map-routing.js
map-service-area.js
photo-upload.js
table-report.js
table-site.js
time-relative.js
user-selector.js
```
### 4b. Determine which are still used
The remaining active Go templates (mailer, oauth, privacy) may reference some of these. Check each active template for `<script src="/static/js/...">` references. Templates that are confirmed unused:
- All templates in `html/template/sync/` (dashboard, cell, communication-root, district, intelligence, layout, operations-root, planning-root, radar, review, sudo, upload-*) — these are replaced by Vue SPAs
- Most templates in `html/template/rmo/` — RMO routes are all commented out
### 4c. Migrate any still-needed functionality
The map-locator, address-suggestion, and photo-upload functionality has Vue equivalents in `ts/components/`. The remaining custom element patterns should be fully replaced by Vue components.
---
## 5. TomTom Integration — Removed ✅
*(Completed 2026-05-09: `tomtom/` directory removed — zero imports outside itself, Stadia Maps is now the geocoding/tile provider)*
---
## 6. Postgrid — Alternate Mail Provider
**Status:** `postgrid/` contains a single CLI tool (`cmd/send-pdf`) and a `postgrid` Go package reference in `main.go`. Lob is now the mail provider, with its own integration in `lob/`.
### 6a. Investigate and remove if unused
- Check if Postgrid is actually being used in production vs Lob
- If Lob is the chosen provider, remove `postgrid/` entirely
- Remove any Postgrid configuration references
---
## 7. Duplicate Architecture: `api/` vs `resource/`
**Status:** The `api/` package contains both route registration (`api/routes.go`) and handler functions (`api/signin.go`, `api/publicreport.go`, `api/compliance.go`, etc.). The `resource/` package provides typed resource handlers that expose `List`, `Get`, `Create`, etc. Some functionality exists in both layers.
### 7a. Consolidate handler functions
Functions in `api/` that directly handle business logic should be moved to `resource/`:
- `api/signin.go` — `postSignin`, `postSignout`, `postSignup`
- `api/compliance.go` — various compliance handlers
- `api/publicreport.go` — `postPublicreportInvalid`, `postPublicreportSignal`, `postPublicreportMessage`
- `api/sudo.go` — `postSudoEmail`, `postSudoSMS`, `postSudoSSE`
- `api/configuration.go` — `postConfigurationIntegrationArcgis`
- `api/review.go` — `postReviewPool`
- `api/twilio.go`, `api/voipms.go` — webhook handlers
- `api/audio.go`, `api/image.go` — media upload handlers
- `api/tile.go`, `api/debug.go` — utilities
### 7b. Standardize on resource pattern
Either move everything to `resource/` or keep both but clearly define responsibilities:
- `resource/` — domain resource CRUD + URI generation
- `api/` — route registration + HTTP concerns only
Currently the split is unclear and some `api/` files do substantial business logic.
---
## 8. `arcgis-go` Submodule — Not Checked Out
**Status:** The `arcgis-go` submodule (referenced in `.gitmodules`) is not checked out (empty directory). The external `github.com/Gleipnir-Technology/arcgis-go` package is used via `go.mod` instead.
### 8a. Remove submodule
```bash
git submodule deinit arcgis-go
git rm arcgis-go
```
Verify that all code references use the external package, not a local path.
---
## 9. `go-geojson2h3` Local Copy
**Status:** `go-geojson2h3/` is also a submodule. The external package `github.com/Gleipnir-Technology/go-geojson2h3/v2` is imported in `go.mod`. Only `h3utils/h3.go` references it.
### 9a. Consolidate
- If the local copy isn't needed, remove the submodule
- If local modifications exist, merge upstream or maintain intentionally with documentation
---
## 10. Old Generated Files & Artifacts
### 10a. `query.go` at project root — Removed ✅
### 10b. `db/sql/` directory
Contains `.bob.go` and `.bob.sql` files — these are Bob-style named queries. Once Bob is removed, these can be cleaned up or migrated to Jet equivalents.
### 10c. `static/gen/main.js`
A leftover built artifact. The new build output goes to `static/gen/sync/` and `static/gen/rmo/` via Vite. Ensure `static/gen/` is in `.gitignore` and the stale `main.js` is removed.
### 10d. `static/css/placeholder`
Empty placeholder file. Remove.
---
## 11. Nix devShell Cleanup
**Status:** `flake.nix` devShell includes several tools from older workflows:
### 11a. Potentially unnecessary devShell packages
- ~~`pkgs.esbuild` — replaced by Vite~~ (already dropped from the devShell 2026-05-09 along with `build.js`; see item 3)
- `pkgs.dart-sass` — Vue/Vite uses the `sass` npm package; check if Go code invokes dart-sass directly
- `pkgs.autoprefixer` — may not be needed with Vite's built-in PostCSS
---
## 12. Start Scripts — Consolidate
**Status:** Four start scripts exist:
| Script | Purpose |
|--------|---------|
| `start-air.sh` | Development with air (live reload) |
| `start-flogo.sh` | Unknown (references `flogo`) |
| `start-nidus-sync.sh` | Production-like direct run |
| `start-nix-built.sh` | Run Nix-built output |
`start-flogo.sh` may be a remnant. Investigate and remove if unused.
---
## Priority Summary
1. **High impact, low effort:**
- ~~Remove `tomtom/` (unused, no imports)~~
- ~~Remove `build.js` (dead, replaced by Vite)~~
- Remove commented-out routes in `sync/routes.go` and `rmo/routes.go`
- ~~Remove `query.go` commented-out code~~
- Remove `static/gen/main.js` stale artifact
- Remove `static/css/placeholder`
2. **Medium impact, medium effort:**
- Remove unused Go HTML templates (confirm which are still active first)
- Remove unused `static/js/` files (verify against active templates)
- Remove `arcgis-go` submodule
- Clean up Nix devShell
3. **High impact, high effort:**
- Complete Bob → Jet migration across all schemas
- Remove Bob-generated models, helpers, scripts
- Remove Bob from go.mod
- Consolidate `api/` and `resource/` handler patterns
- Remove `html/` package (after all Go templates are gone)

207
HISTORY.md Normal file
View file

@@ -0,0 +1,207 @@
# nidus-sync — Project History
## Overview
nidus-sync is a dual-tenant mosquito abatement platform serving two domains:
- **RMO** (`report.mosquitoes.online`) — Public-facing mosquito/water/nuisance reporting
- **Sync** (`sync.nidus.cloud`) — Administrative dashboard for vector control districts
The project was started in November 2025 and has undergone several major architectural shifts across ~1655 commits spanning 6 months.
---
## Timeline
### Phase 1: Foundation (November 2025)
**Nov 3 – Nov 13: Project bootstrap**
- Initial Go project with Nix build system (`flake.nix`, `default.nix`)
- Basic `net/http` web serving with `gorilla/mux` routing
- Go `html/template` server-side rendering
- Bob ORM integration (`github.com/Gleipnir-Technology/bob`) for PostgreSQL — code-generated models via `bobgen`
- ArcGIS OAuth integration for user authentication
- ArcGIS Fieldseeker data synchronization (treatment areas, inspections, breeding sources, etc.)
- MapBox GL JS integration for heatmap visualization
- Dashboard with login, basic CRUD mocks
**Nov 13 – Nov 24: Logging & DB restructuring**
- Migration from standard `log` to `zerolog` for structured, colorized output
- Database logic moved into a separate `db/` subdirectory
- Clean shutdown logic, token refresh loops
**Key characteristics:** Monolithic Go server, HTML templates, Bob ORM, MapBox maps, ArcGIS OAuth
---
### Phase 2: Fieldseeker & Schema Evolution (December 2025)
**Dec 2 – Dec 24: Fieldseeker schema v2**
- Bob codegen updated to latest version
- Fieldseeker schema captured on OAuth connect and stored locally
- Dynamic SQL functions replacing hardcoded per-table sync logic
- Old Fieldseeker tables removed, v2 generated tables used
- Note/image audio support added
- MMS file downloads from SMS webhooks
**Key characteristics:** Bob-generated fieldseeker models, prepared SQL functions, SMS/MMS debugging
---
### Phase 3: Architecture Maturation (January 2026)
**Jan 2 – Jan 8: Domain split & template system**
- WIP pass-through models concept ("Checkpoint on initial idea for passing through models")
- Massive reorganization: templates split into `rmo/` (public) and `sync/` (admin) subdirectories
- `html/` package created with embedded template loading
- Bob submodule removed, `arcgis-go` became external dependency
- Public report domain support added
- Version bumped 7 times in rapid iteration (v0.0.4 → v0.0.10)
**Jan 8 – Jan 31: Platform Layer emergence**
- "Report platform layer" introduced (`a9b0a55f`) — initial abstraction between HTTP handlers and database
- Address suggestion and map-locator components via custom HTML elements
- SVG auto-transformation into Go templates
- Report submission forms wired up (nuisance, water)
- Email template system
**Key characteristics:** Two-domain architecture (RMO/Sync), `html/` template package, platform layer beginning, custom element web components
---
### Phase 4: Map Migration & Platform Expansion (February 2026)
**Feb 1 – Feb 28: Map provider transition**
- MapBox → MapLibre GL (open-source fork) via `maplibre-gl`
- Stadia Maps integration for tile serving and geocoding (Feb 12-14)
- TomTom routing integration added (Feb 17)
- Bulk geocoding via Stadia
- Parcel image generation debugging
**Platform layer expansion:**
- Emails moved to platform layer
- Phone/SMS support
- OAuth integration settings
- Upload platform functions
- QR code and image tile moved into platform
- Admin map components
**Key characteristics:** MapLibre/Stadia replacing MapBox, TomTom added, platform layer expanding, heavy template iteration
---
### Phase 5: VueJS Revolution (March 2026) — 448 commits
**Mar 5 – Mar 12: Pre-Vue cleanup**
- Stadia Maps client initialization
- Signal database schema added
- Review task/mailer schema rework
- Generated Bob files pruned
**Mar 12: Massive platform layer rework** (`44c4f17f`)
- User/organization handling restructured in platform layer
- Signal creation moved inside platform
**Mar 18 – Mar 22: VueJS Migration** (the biggest architectural shift)
- Mar 18: Auto-generated report IDs
- Mar 21: **VueJS introduced** — begins with TypeScript bundle, then Vue SFC components, vue-router, Bootstrap/SCSS integration
- Mar 21: Dashboard, Intelligence, sidebar all moved to Vue
- Mar 22: **esbuild replaced by Vite** (`47f900ab`) — `vite/` directory with separate configs for `sync` and `rmo` SPAs
- Mar 22: TypeScript checking clean across entire frontend
- Mar 23: Public report card component, auth checks off API client
- Mar 24-31: Communication page ripped into components, impersonation support, users page
**Key characteristics:** VueJS 3 + TypeScript + Vite frontend, Pinia stores, vue-router, SCSS, SPA architecture replacing server-rendered Go templates
---
### Phase 6: Compliance & Communication (April 2026) — 454 commits
**Apr 1 – Apr 9: RMO frontend & resources**
- Resource layer expanded (user, avatar, district, nuisance, water, compliance resources)
- RMO frontend checkpoint — Vue ports of public-facing pages
- TS types migrated into API module
- Old bundle paths removed, old SPA generation removed
**Apr 10 – Apr 17: Compliance workflow**
- Compliance report creation, mailer flow
- Site/pool review tasks
- Stadia Maps cache, direct tile access
- OAuth refresh in frontend
- Image upload components
**Apr 17 – Apr 25: Communication system**
- Background jobs reworked for shorter transactions
- Lob (physical mail) integration — direct API client, address creation, letter events
- QR code generation moved to API
- Compliance report evidence, mailer views
- Vue map system generalized (`cad01e68`)
**Apr 25 – Apr 30: Map & communication polish**
- VueJS reimplementation of address/report suggestion
- Communication workbench with map, list, detail views
- Text message log, email/phone display
- Compliance card detail display
- SSE event system with status vs resource message distinction
- Systemd socket activation for downtime-free deploys
- Sentry error tracking for Vue frontend
**Key characteristics:** Compliance/mailer operational, communication system born, Lob integration, Sentry, generalized Vue map system
---
### Phase 7: Jet Migration & Cleanup (May 2026) — 46 commits so far
**May 1 – May 9: SQL generation transition**
- **Jet (go-jet/jet) introduced** — type-safe SQL builder replacing Bob's query building
- Custom Jet generator created with geometry/Box2D type support (`db/jet/main.go`)
- `publicreport` schema ported to Jet
- `arcgis` schema ported to Jet (compiles, not fully tested per commit message)
- New `communication` table added
- Communication marking workflow (invalid, pending-response, possible-issue, possible-resolved)
- Linting: `golangci-lint` added to lefthook, per-file linting
- Cleanup of legacy generated columns (latitude/longitude), string-based queries
- Centralized error handler for Vue sync app
**Key characteristics:** Bob→Jet transition in progress, communication workflow, code quality improvements
---
## Architectural Patterns (by layer)
### Current architecture stack
```
┌─────────────────────────────────────────────────┐
│ Vue 3 SPA (TypeScript) │
│ ts/ — shared components, composables, stores │
│ vite/sync/ — admin SPA entry │
│ vite/rmo/ — public SPA entry │
├─────────────────────────────────────────────────┤
│ Go HTTP Server (gorilla/mux) │
│ api/routes.go — central route registration │
│ resource/ — resource handlers (REST patterns) │
│ sync/ — remaining Go template routes │
│ rmo/ — remaining Go template routes │
├─────────────────────────────────────────────────┤
│ platform/ — business logic layer │
│ (address, compliance, communication, district, │
│ email, fieldseeker, mailer, publicreport, │
│ review, signal, text, user, upload, etc.) │
├─────────────────────────────────────────────────┤
│ db/ — database access │
│ db/models/ — Bob-generated models (103 files) │
│ db/query/ — Jet-based query functions │
│ db/prepared.go — prepared SQL functions │
├─────────────────────────────────────────────────┤
│ PostgreSQL │
└─────────────────────────────────────────────────┘
```
### Pattern: Platform Layer
Introduced January 2026, the `platform/` package encapsulates business logic between HTTP handlers and the database. It grew from initial report handling to encompass users, organizations, emails, texts, compliance, communications, signals, geocoding, tiles, uploads, and more.
### Pattern: Resource Layer
Added March–April 2026, `resource/` provides typed REST resource handlers with URI generation (via mux route naming). Resources are instantiated with a `resource.NewRouter()` and expose methods like `List`, `Get`, `Create`, `Update`, `Delete` that return domain types. This replaced ad-hoc handler functions in `api/`.
### Pattern: Dual SPA + API
Since late March 2026, both domains serve Vue SPAs for most routes, with the Go server acting as an API backend. The `static.SinglePageApp()` handler serves the Vite-built output and falls back to `index.html` for client-side routing. Some Go template routes remain for mailer PDF generation, OAuth flows, and previews.

View file

@@ -2,6 +2,25 @@
This is the software that powers [Nidus Cloud Sync](https://sync.nidus.cloud).
## Administration
### Password resets
If you need to manually reset a password you can do so with:
```
$ nix-shell -p genpass
$ genpass 12
abc123abc123
# this is from nidus, installed on deployment servers at the system layer
$ passwordgen
Please enter your password: abc123abc123
Password: abc123abc123
Hash: $2a$14$hdtoAtP7joczutY3bxaFqemBApH8xc5NbXLvDQqBfdzWV3jGSy4zi
$ psql -d nidus-sync
nidus-sync=> update "user" set password_hash='$2a$14$hdtoAtP7joczutY3bxaFqemBApH8xc5NbXLvDQqBfdzWV3jGSy4zi' where id=<something>;
```
## Building from source
First, you'll need [Nix](https://nix.dev).
@@ -48,7 +67,7 @@ There's a table containing district information in the database, `import.district`
psql
CREATE SCHEMA import;
shp2pgsql -s 3857 -c -D -I CA_districts.shp import.district | psql -d nidus-sync
psql
psql -d nidus-sync
ALTER TABLE import.district ADD COLUMN geom_4326 geometry(MultiPolygon,4326) GENERATED ALWAYS AS (ST_Transform(geom, 4326)) STORED;
```
@@ -84,10 +103,38 @@ This uses [goose](https://github.com/pressly/goose). You can use the goose comma
> GOOSE_DRIVER=postgres GOOSE_DBSTRING="dbname=nidus-sync sslmode=disable" goose up
```
### svg icons
These icons are generated as part of the build system. You can generate them manually with:
```
pnpm generate-icons
```
This will produce an scss file at `ts/gen/custom-icons.scss`
### typescript
In order to work on the TypeScript code you'll need to install the dependencies locally in your dev environment:
```
nix develop
pnpm install
```
You can then generate the TypeScript with:
```
pnpm watch
```
The only page that works right now is `https://sync.nidus.cloud/template-test`
### watchexec
For iterating on styles
```
watchexec -e *.scss sass scss/custom.scss:html/static/css/bootstrap.css
watchexec -e scss sass scss/custom.scss:static/gen/css/bootstrap.css
```

View file

@@ -1,7 +1,7 @@
package api
import (
"encoding/json"
"context"
"fmt"
"io"
"net/http"
@@ -9,84 +9,70 @@ import (
"strconv"
"time"
"github.com/Gleipnir-Technology/nidus-sync/config"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/Gleipnir-Technology/nidus-sync/db/models"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/lint"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/background"
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/aarondl/opt/omit"
"github.com/aarondl/opt/omitnull"
"github.com/go-chi/chi/v5"
"github.com/go-chi/render"
"github.com/google/uuid"
"github.com/Gleipnir-Technology/nidus-sync/platform/types"
"github.com/Gleipnir-Technology/nidus-sync/resource"
"github.com/Gleipnir-Technology/nidus-sync/version"
//"github.com/gorilla/mux"
"github.com/rs/zerolog/log"
)
func apiAudioPost(w http.ResponseWriter, r *http.Request, u platform.User) {
id := chi.URLParam(r, "uuid")
noteUUID, err := uuid.Parse(id)
if err != nil {
http.Error(w, "Failed to decode the uuid", http.StatusBadRequest)
return
}
/*
type renderer struct {
}
func (ren *renderer) Render(w http.ResponseWriter, r *http.Request) error {
return nil
}
*/
// In the best case scenario, the excellent github.com/pkg/errors package
// helps reveal information on the error, setting it on Err, and in the Render()
// method, using it to set the application-specific error code in AppCode.
type ResponseErr struct {
Error error `json:"-"` // low-level runtime error
HTTPStatusCode int `json:"-"` // http response status code
var payload NoteAudioPayload
body, err := io.ReadAll(r.Body)
if err != nil {
http.Error(w, "Failed to read the payload", http.StatusBadRequest)
return
}
if err := json.Unmarshal(body, &payload); err != nil {
//debugSaveRequest(body, err, "Audio note POST JSON decode error")
http.Error(w, "Failed to decode the payload", http.StatusBadRequest)
return
}
ctx := r.Context()
setter := models.NoteAudioSetter{
Created: omit.From(payload.Created),
CreatorID: omit.From(int32(u.ID)),
Deleted: omitnull.FromPtr(payload.Deleted),
DeletorID: omitnull.FromPtr(payload.DeletorID),
Duration: omit.From(payload.Duration),
OrganizationID: omit.From(u.Organization.ID()),
Transcription: omitnull.FromPtr(payload.Transcription),
TranscriptionUserEdited: omit.From(payload.TranscriptionUserEdited),
Version: omit.From(payload.Version),
UUID: omit.From(noteUUID),
}
if err := platform.NoteAudioCreate(ctx, u, setter); err != nil {
render.Render(w, r, errRender(err))
return
}
w.WriteHeader(http.StatusAccepted)
StatusText string `json:"status"` // user-level status message
AppCode int64 `json:"code,omitempty"` // application-specific error code
ErrorText string `json:"error,omitempty"` // application-level error message, for debugging
}
func apiAudioContentPost(w http.ResponseWriter, r *http.Request, user platform.User) {
u_str := chi.URLParam(r, "uuid")
u, err := uuid.Parse(u_str)
if err != nil {
http.Error(w, "Failed to parse image UUID", http.StatusBadRequest)
return
}
err = file.FileContentWrite(r.Body, file.CollectionAudioRaw, u)
if err != nil {
log.Printf("Failed to write content file: %v", err)
http.Error(w, "failed to write content file", http.StatusInternalServerError)
}
ctx := r.Context()
a, err := models.NoteAudios.Query(
models.SelectWhere.NoteAudios.UUID.EQ(u),
models.SelectWhere.NoteAudios.OrganizationID.EQ(user.Organization.ID()),
).One(ctx, db.PGInstance.BobDB)
background.NewAudioTranscode(ctx, db.PGInstance.BobDB, a.ID)
w.WriteHeader(http.StatusOK)
func (e *ResponseErr) Render(w http.ResponseWriter, r *http.Request) error {
http.Error(w, e.StatusText, e.HTTPStatusCode)
return nil
}
func errRender(err error) *ResponseErr {
log.Error().Err(err).Msg("Rendering error")
return &ResponseErr{
Error: err,
HTTPStatusCode: 500,
StatusText: "Error rendering response",
ErrorText: err.Error(),
}
}
type Renderable interface {
Render(http.ResponseWriter, *http.Request) error
}
func renderShim(w http.ResponseWriter, r *http.Request, renderer Renderable) error {
return renderer.Render(w, r)
}
func renderList(w http.ResponseWriter, r *http.Request, data []Renderable) error {
return nil
}
func handleClientIos(w http.ResponseWriter, r *http.Request, u platform.User) {
var sinceStr string
err := r.ParseForm()
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse GET form: %w", err)))
err = renderShim(w, r, errRender(fmt.Errorf("Failed to parse GET form: %w", err)))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
} else {
sinceStr = r.FormValue("since")
@@ -98,14 +84,20 @@ func handleClientIos(w http.ResponseWriter, r *http.Request, u platform.User) {
} else {
since, err = parseTime(sinceStr)
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse 'since' value: %w", err)))
err = renderShim(w, r, errRender(fmt.Errorf("Failed to parse 'since' value: %w", err)))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
}
csync, err := platform.ContentClientIos(r.Context(), u, since)
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
@@ -119,8 +111,11 @@ func handleClientIos(w http.ResponseWriter, r *http.Request, u platform.User) {
Fieldseeker: toResponseFieldseeker(csync.Fieldseeker),
Since: since_used,
}
if err := render.Render(w, r, response); err != nil {
render.Render(w, r, errRender(err))
if err := renderShim(w, r, response); err != nil {
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
}
@@ -128,7 +123,10 @@ func handleClientIos(w http.ResponseWriter, r *http.Request, u platform.User) {
func apiMosquitoSource(w http.ResponseWriter, r *http.Request, u platform.User) {
bounds, err := parseBounds(r)
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
@@ -137,23 +135,32 @@ func apiMosquitoSource(w http.ResponseWriter, r *http.Request, u platform.User)
query.Limit = 100
sources, err := platform.MosquitoSourceQuery()
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
data := []render.Renderer{}
data := []Renderable{}
for _, s := range sources {
data = append(data, NewResponseMosquitoSource(s))
}
if err := render.RenderList(w, r, data); err != nil {
render.Render(w, r, errRender(err))
if err := renderList(w, r, data); err != nil {
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
}
}
func apiTrapData(w http.ResponseWriter, r *http.Request, u platform.User) {
bounds, err := parseBounds(r)
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
@@ -162,23 +169,32 @@ func apiTrapData(w http.ResponseWriter, r *http.Request, u platform.User) {
query.Limit = 100
trap_data, err := platform.TrapDataQuery()
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
data := []render.Renderer{}
data := []Renderable{}
for _, td := range trap_data {
data = append(data, NewResponseTrapDatum(td))
}
if err := render.RenderList(w, r, data); err != nil {
render.Render(w, r, errRender(err))
if err := renderList(w, r, data); err != nil {
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
}
}
func apiServiceRequest(w http.ResponseWriter, r *http.Request, u platform.User) {
bounds, err := parseBounds(r)
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
query := db.NewGeoQuery()
@@ -186,16 +202,22 @@ func apiServiceRequest(w http.ResponseWriter, r *http.Request, u platform.User)
query.Limit = 100
requests, err := platform.ServiceRequestQuery()
if err != nil {
render.Render(w, r, errRender(err))
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
return
}
data := []render.Renderer{}
data := []Renderable{}
for _, sr := range requests {
data = append(data, NewResponseServiceRequest(sr))
data = append(data, types.ServiceRequestFromModel(sr))
}
if err := render.RenderList(w, r, data); err != nil {
render.Render(w, r, errRender(err))
if err := renderList(w, r, data); err != nil {
err = renderShim(w, r, errRender(err))
if err != nil {
http.Error(w, fmt.Sprintf("render shim: %v", err), http.StatusInternalServerError)
}
}
}
@@ -236,16 +258,6 @@ func parseBounds(r *http.Request) (*db.GeoBounds, error) {
return &bounds, nil
}
func errRender(err error) render.Renderer {
log.Error().Err(err).Msg("Rendering error")
return &ResponseErr{
Error: err,
HTTPStatusCode: 500,
StatusText: "Error rendering response",
ErrorText: err.Error(),
}
}
func webhookFieldseeker(w http.ResponseWriter, r *http.Request) {
// Create or open the log file
file, err := os.OpenFile("webhook/request.log", os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0644)
@@ -254,17 +266,32 @@ func webhookFieldseeker(w http.ResponseWriter, r *http.Request) {
http.Error(w, "Internal Server Error", http.StatusInternalServerError)
return
}
defer file.Close()
defer lint.LogOnErr(file.Close, "close request log")
// Write timestamp
timestamp := time.Now().Format("2006-01-02 15:04:05")
fmt.Fprintf(file, "\n=== Request logged at %s ===\n", timestamp)
_, err = fmt.Fprintf(file, "\n=== Request logged at %s ===\n", timestamp)
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
// Write request line
fmt.Fprintf(file, "%s %s %s\n", r.Method, r.RequestURI, r.Proto)
_, err = fmt.Fprintf(file, "%s %s %s\n", r.Method, r.RequestURI, r.Proto)
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
// Write all headers
fmt.Fprintf(file, "\nHeaders:\n")
_, err = fmt.Fprintf(file, "\nHeaders:\n")
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
for name, values := range r.Header {
for _, value := range values {
fmt.Fprintf(file, "%s: %s\n", name, value)
@@ -272,13 +299,29 @@ func webhookFieldseeker(w http.ResponseWriter, r *http.Request) {
}
// Write body
fmt.Fprintf(file, "\nBody:\n")
_, err = fmt.Fprintf(file, "\nBody:\n")
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
body, err := io.ReadAll(r.Body)
if err != nil {
log.Printf("Error reading request body: %v", err)
fmt.Fprintf(file, "Error reading body: %v\n", err)
_, err = fmt.Fprintf(file, "Error reading body: %v\n", err)
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
} else {
file.Write(body)
_, err = file.Write(body)
if err != nil {
log.Error().Err(err).Msg("writing response")
http.Error(w, "Internal server error", http.StatusInternalServerError)
return
}
if len(body) == 0 {
fmt.Fprintf(file, "(empty body)")
}
@@ -300,3 +343,27 @@ func parseTime(x string) (*time.Time, error) {
created := time.UnixMilli(created_epoch)
return &created, nil
}
type about struct {
Environment string `json:"environment"`
SentryDSN string `json:"sentry_dsn"`
Tegola tegolaURLs `json:"tegola"`
Version version.VersionInfo `json:"version"`
}
type tegolaURLs struct {
Nidus string `json:"nidus"`
RMO string `json:"rmo"`
}
func getRoot(ctx context.Context, r *http.Request, q resource.QueryParams) (*about, *nhttp.ErrorWithStatus) {
v := version.Get()
return &about{
Environment: config.Environment,
SentryDSN: config.SentryDSNFrontend,
Tegola: tegolaURLs{
Nidus: config.MakeURLTegola("/maps/nidus/{z}/{x}/{y}?id={organization_id}"),
RMO: config.MakeURLTegola("/maps/rmo/{z}/{x}/{y}"),
},
Version: v,
}, nil
}

api/audio.go Normal file (+93)

@@ -0,0 +1,93 @@
package api
import (
"encoding/json"
"io"
"net/http"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/Gleipnir-Technology/nidus-sync/db/models"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/background"
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/aarondl/opt/omit"
"github.com/aarondl/opt/omitnull"
"github.com/google/uuid"
"github.com/gorilla/mux"
"github.com/rs/zerolog/log"
)
func apiAudioPost(w http.ResponseWriter, r *http.Request, u platform.User) {
vars := mux.Vars(r)
id := vars["uuid"]
noteUUID, err := uuid.Parse(id)
if err != nil {
http.Error(w, "Failed to decode the uuid", http.StatusBadRequest)
return
}
var payload NoteAudioPayload
body, err := io.ReadAll(r.Body)
if err != nil {
http.Error(w, "Failed to read the payload", http.StatusBadRequest)
return
}
if err := json.Unmarshal(body, &payload); err != nil {
//debugSaveRequest(body, err, "Audio note POST JSON decode error")
http.Error(w, "Failed to decode the payload", http.StatusBadRequest)
return
}
ctx := r.Context()
setter := models.NoteAudioSetter{
Created: omit.From(payload.Created),
CreatorID: omit.From(int32(u.ID)),
Deleted: omitnull.FromPtr(payload.Deleted),
DeletorID: omitnull.FromPtr(payload.DeletorID),
Duration: omit.From(payload.Duration),
OrganizationID: omit.From(u.Organization.ID),
Transcription: omitnull.FromPtr(payload.Transcription),
TranscriptionUserEdited: omit.From(payload.TranscriptionUserEdited),
Version: omit.From(payload.Version),
UUID: omit.From(noteUUID),
}
if err := platform.NoteAudioCreate(ctx, u, setter); err != nil {
renderShim(w, r, errRender(err))
return
}
w.WriteHeader(http.StatusAccepted)
}
func apiAudioContentPost(w http.ResponseWriter, r *http.Request, user platform.User) {
vars := mux.Vars(r)
u_str := vars["uuid"]
u, err := uuid.Parse(u_str)
if err != nil {
http.Error(w, "Failed to parse image UUID", http.StatusBadRequest)
return
}
err = file.FileContentWrite(r.Body, file.CollectionAudioRaw, u)
if err != nil {
log.Printf("Failed to write content file: %v", err)
http.Error(w, "failed to write content file", http.StatusInternalServerError)
return
}
ctx := r.Context()
a, err := models.NoteAudios.Query(
models.SelectWhere.NoteAudios.UUID.EQ(u),
models.SelectWhere.NoteAudios.OrganizationID.EQ(user.Organization.ID),
).One(ctx, db.PGInstance.BobDB)
if err != nil {
log.Printf("Failed to get note audio %s for org %d: %v", u_str, user.Organization.ID, err)
http.Error(w, "failed to update database", http.StatusBadRequest)
return
}
err = background.NewAudioTranscode(ctx, db.PGInstance.BobDB, a.ID)
if err != nil {
log.Printf("Failed to transcode audio %s for org %d: %v", u_str, user.Organization.ID, err)
http.Error(w, "failed to transcode audio", http.StatusBadRequest)
return
}
w.WriteHeader(http.StatusOK)
}

api/avatar.go Normal file (+1)

@@ -0,0 +1 @@
package api


@@ -1,67 +1 @@
package api
import (
"context"
"net/http"
"slices"
"time"
"github.com/Gleipnir-Technology/nidus-sync/config"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/publicreport"
"github.com/Gleipnir-Technology/nidus-sync/platform/types"
"github.com/google/uuid"
//"github.com/rs/zerolog/log"
)
type communication struct {
Created time.Time `json:"created"`
ID string `json:"id"`
PublicReport types.PublicReport `json:"public_report"`
Type string `json:"type"`
}
type contentListCommunication struct {
Communications []communication `json:"communications"`
}
func listCommunication(ctx context.Context, r *http.Request, user platform.User, query queryParams) (*contentListCommunication, *nhttp.ErrorWithStatus) {
reports, err := publicreport.ReportsForOrganization(ctx, user.Organization.ID())
if err != nil {
return nil, nhttp.NewError("nuisance report query: %w", err)
}
comms := make([]communication, len(reports))
for i, report := range reports {
comms[i] = communication{
Created: report.Created,
ID: report.PublicID,
PublicReport: report,
Type: "publicreport." + string(report.Type),
}
}
_by_created := func(a, b communication) int {
if a.Created == b.Created {
return 0
} else if a.Created.Before(b.Created) {
return 1
} else {
return -1
}
}
slices.SortFunc(comms, _by_created)
return &contentListCommunication{
Communications: comms,
}, nil
}
func toImageURLs(m map[string][]uuid.UUID, id string) []string {
uuids, ok := m[id]
if !ok {
return []string{}
}
urls := make([]string, len(uuids))
for i, u := range uuids {
urls[i] = config.MakeURLNidus("/api/image/%s/content", u.String())
}
return urls
}


@@ -13,14 +13,15 @@ import (
"github.com/Gleipnir-Technology/bob/dialect/psql/sm"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/go-chi/chi/v5"
"github.com/gorilla/mux"
"github.com/paulmach/orb/geojson"
"github.com/rs/zerolog/log"
"github.com/stephenafamo/scan"
)
func getComplianceRequestImagePool(w http.ResponseWriter, r *http.Request) {
code := chi.URLParam(r, "public_id")
vars := mux.Vars(r)
code := vars["public_id"]
if code == "" {
http.Error(w, "empty public_id", http.StatusBadRequest)
return


@@ -1,4 +1,4 @@
package sync
package api
import (
"context"
@@ -7,7 +7,8 @@ import (
"github.com/Gleipnir-Technology/bob/dialect/psql"
"github.com/Gleipnir-Technology/bob/dialect/psql/um"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/Gleipnir-Technology/nidus-sync/db/models"
"github.com/Gleipnir-Technology/nidus-sync/db/gen/nidus-sync/arcgis/model"
queryarcgis "github.com/Gleipnir-Technology/nidus-sync/db/query/arcgis"
"github.com/Gleipnir-Technology/nidus-sync/html"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
@ -25,9 +26,9 @@ type contentSettingOrganization struct {
}
type contentSettingIntegration struct {
ArcGISAccount *models.ArcgisAccount
ArcGISOAuth *models.ArcgisOauthToken
ServiceMaps []*models.ArcgisServiceMap
ArcGISAccount *model.Account
ArcGISOAuth *model.OAuthToken
ServiceMaps []model.ServiceMap
}
func getConfigurationOrganization(ctx context.Context, r *http.Request, u platform.User) (*html.Response[contentSettingOrganization], *nhttp.ErrorWithStatus) {
@@ -82,23 +83,21 @@ func getConfigurationIntegrationArcgis(ctx context.Context, r *http.Request, u p
if err != nil {
return nil, nhttp.NewError("Failed to get oauth: %w", err)
}
var account *models.ArcgisAccount
var service_maps []*models.ArcgisServiceMap
var account model.Account
var service_maps []model.ServiceMap
account_id := u.Organization.ArcgisAccountID()
if account_id != "" {
account, err = models.FindArcgisAccount(ctx, db.PGInstance.BobDB, account_id)
account, err = queryarcgis.AccountFromID(ctx, account_id)
if err != nil {
return nil, nhttp.NewError("Failed to get arcgis: %w", err)
}
service_maps, err = models.ArcgisServiceMaps.Query(
models.SelectWhere.ArcgisServiceMaps.AccountID.EQ(account.ID),
).All(ctx, db.PGInstance.BobDB)
service_maps, err = queryarcgis.ServiceMapsFromAccountID(ctx, account.ID)
if err != nil {
return nil, nhttp.NewError("Failed to get map services: %w", err)
}
}
data := contentSettingIntegration{
ArcGISAccount: account,
ArcGISAccount: &account,
ArcGISOAuth: oauth,
ServiceMaps: service_maps,
}
@@ -133,12 +132,12 @@ func postConfigurationIntegrationArcgis(ctx context.Context, r *http.Request, u
_, err := psql.Update(
um.Table("organization"),
um.SetCol("arcgis_map_service_id").ToArg(f.MapService),
um.Where(psql.Quote("id").EQ(psql.Arg(u.Organization.ID()))),
um.Where(psql.Quote("id").EQ(psql.Arg(u.Organization.ID))),
).Exec(ctx, db.PGInstance.BobDB)
if err != nil {
return "", nhttp.NewError("Failed to update map service config: %w", err)
}
log.Info().Str("map-service", *f.MapService).Int32("org-id", u.Organization.ID()).Msg("changed map service")
log.Info().Str("map-service", *f.MapService).Int32("org-id", u.Organization.ID).Msg("changed map service")
} else {
log.Info().Msg("no map service")
}


@@ -5,6 +5,7 @@ import (
"net/http"
"os"
"github.com/Gleipnir-Technology/nidus-sync/lint"
"github.com/rs/zerolog/log"
)
@@ -14,7 +15,7 @@ func debugSaveRequest(r *http.Request) {
log.Error().Err(err).Msg("failed to create temp file for debugSaveRequest")
return
}
defer tmpFile.Close()
defer lint.LogOnErr(tmpFile.Close, "close temp file")
_, err = io.Copy(tmpFile, r.Body)
if err != nil {


@@ -9,15 +9,14 @@ import (
"github.com/Gleipnir-Technology/nidus-sync/db/models"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/go-chi/chi/v5"
"github.com/go-chi/render"
"github.com/gorilla/mux"
)
func apiGetDistrict(w http.ResponseWriter, r *http.Request) {
var latStr, lngStr string
err := r.ParseForm()
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse GET form: %w", err)))
renderShim(w, r, errRender(fmt.Errorf("Failed to parse GET form: %w", err)))
return
} else {
latStr = r.FormValue("lat")
@@ -25,17 +24,17 @@ func apiGetDistrict(w http.ResponseWriter, r *http.Request) {
}
lat, err := strconv.ParseFloat(latStr, 64)
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse lat as float: %w", err)))
renderShim(w, r, errRender(fmt.Errorf("Failed to parse lat as float: %w", err)))
return
}
lng, err := strconv.ParseFloat(lngStr, 64)
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse lng as float: %w", err)))
renderShim(w, r, errRender(fmt.Errorf("Failed to parse lng as float: %w", err)))
return
}
org, err := platform.DistrictForLocation(r.Context(), lng, lat)
if err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to get district: %w", err)))
renderShim(w, r, errRender(fmt.Errorf("Failed to get district: %w", err)))
return
}
if org == nil {
@@ -48,13 +47,14 @@ func apiGetDistrict(w http.ResponseWriter, r *http.Request) {
Phone: org.OfficePhone.GetOr(""),
Website: org.Website.GetOr(""),
}
if err := render.Render(w, r, d); err != nil {
render.Render(w, r, errRender(err))
if err := renderShim(w, r, d); err != nil {
renderShim(w, r, errRender(err))
}
}
func apiGetDistrictLogo(w http.ResponseWriter, r *http.Request) {
slug := chi.URLParam(r, "slug")
vars := mux.Vars(r)
slug := vars["slug"]
ctx := r.Context()
rows, err := models.Organizations.Query(
models.SelectWhere.Organizations.Slug.EQ(slug),
@@ -73,7 +73,7 @@ func apiGetDistrictLogo(w http.ResponseWriter, r *http.Request) {
http.Error(w, "Logo not found", http.StatusNotFound)
return
}
file.ImageFileContentWriteLogo(w, org.LogoUUID.MustGet())
file.ImageFileToWriter(file.CollectionLogo, org.LogoUUID.MustGet(), w)
return
default:
http.Error(w, "Too many organizations, this is a programmer error", http.StatusInternalServerError)


@@ -7,17 +7,20 @@ import (
"time"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/event"
"github.com/Gleipnir-Technology/nidus-sync/version"
"github.com/google/uuid"
"github.com/rs/zerolog/log"
)
var connectionsSSE map[*ConnectionSSE]bool = make(map[*ConnectionSSE]bool, 0)
var TYPE_STATUS string = "status"
type ConnectionSSE struct {
chanEvent chan platform.Event
id uuid.UUID
organizationID int32
userID int
userID int32
}
type Message struct {
@@ -27,7 +30,25 @@ type Message struct {
URI string `json:"uri"`
}
type Status struct {
BuildTime time.Time `json:"build_time"`
IsModified bool `json:"is_modified"`
Revision string `json:"revision"`
Status string `json:"status"`
Type string `json:"type"`
}
func (c *ConnectionSSE) SendEvent(w http.ResponseWriter, m platform.Event) error {
if m.Type == event.EventTypeShutdown {
v := version.Get()
return send(w, Status{
BuildTime: v.BuildTime,
IsModified: v.IsModified,
Revision: v.Revision,
Status: m.Type.String(),
Type: TYPE_STATUS,
})
}
return send(w, Message{
Resource: m.Resource,
Time: m.Time,
@@ -46,10 +67,13 @@ func (c *ConnectionSSE) SendHeartbeat(w http.ResponseWriter, t time.Time) error
func SetEventChannel(chan_envelopes <-chan platform.Envelope) {
go func() {
for envelope := range chan_envelopes {
for conn, _ := range connectionsSSE {
if conn.organizationID == envelope.OrganizationID {
for conn := range connectionsSSE {
if conn.organizationID == envelope.OrganizationID || envelope.OrganizationID == 0 {
log.Debug().Int("type", int(envelope.Event.Type)).Int32("env-org", envelope.OrganizationID).Msg("pushed event to client")
conn.chanEvent <- envelope.Event
} else if conn.userID == envelope.UserID {
log.Debug().Int("type", int(envelope.Event.Type)).Int32("env-user", envelope.UserID).Msg("pushed event to user")
conn.chanEvent <- envelope.Event
} else {
log.Debug().Int("type", int(envelope.Event.Type)).Int32("env-org", envelope.OrganizationID).Int32("conn-org", conn.organizationID).Msg("skipped event, bad org")
}
@@ -58,6 +82,7 @@ func SetEventChannel(chan_envelopes <-chan platform.Envelope) {
}
}()
}
func send[T any](w http.ResponseWriter, msg T) error {
jsonData, err := json.Marshal(msg)
if err != nil {
@@ -82,18 +107,35 @@ func streamEvents(w http.ResponseWriter, r *http.Request, u platform.User) {
uid, err := uuid.NewUUID()
if err != nil {
log.Error().Err(err).Msg("failed to create uuid")
http.Error(w, "failed to create uuid", http.StatusInternalServerError)
return
}
connection := ConnectionSSE{
chanEvent: make(chan platform.Event),
id: uid,
organizationID: u.Organization.ID(),
userID: u.ID,
organizationID: u.Organization.ID,
userID: int32(u.ID),
}
connectionsSSE[&connection] = true
log.Debug().Int32("org", u.Organization.ID()).Int("user", u.ID).Str("id", uid.String()).Msg("connected SSE client")
log.Debug().Int32("org", u.Organization.ID).Int("user", u.ID).Str("id", uid.String()).Msg("connected SSE client")
// Send an initial connected event
fmt.Fprintf(w, "event: connected\ndata: {\"status\": \"connected\", \"time\": \"%s\"}\n\n", time.Now().Format(time.RFC3339))
v := version.Get()
status := Status{
BuildTime: v.BuildTime,
IsModified: v.IsModified,
Revision: v.Revision,
Status: "connected",
Type: TYPE_STATUS,
}
body, err := json.Marshal(status)
if err != nil {
log.Error().Err(err).Msg("failed to marshal connect status")
http.Error(w, "failed to marshal connect status", http.StatusInternalServerError)
return
}
fmt.Fprintf(w, "data: %s\n\n", body)
w.(http.Flusher).Flush()
// Keep the connection open with a ticker sending periodic events
@@ -107,7 +149,7 @@ func streamEvents(w http.ResponseWriter, r *http.Request, u platform.User) {
for {
select {
case <-done:
log.Debug().Int32("org", u.Organization.ID()).Int("user", u.ID).Str("id", uid.String()).Msg("Client closed connection")
log.Debug().Int32("org", u.Organization.ID).Int("user", u.ID).Str("id", uid.String()).Msg("Client closed connection")
delete(connectionsSSE, &connection)
return
case t := <-ticker.C:


@@ -8,99 +8,387 @@ import (
"net/http"
"github.com/Gleipnir-Technology/nidus-sync/auth"
"github.com/Gleipnir-Technology/nidus-sync/html"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/Gleipnir-Technology/nidus-sync/resource"
"github.com/google/uuid"
"github.com/gorilla/schema"
"github.com/rs/zerolog/log"
)
var decoder = schema.NewDecoder()
type handlerFunctionGet[T any] func(context.Context, *http.Request, platform.User, queryParams) (*T, *nhttp.ErrorWithStatus)
type wrappedHandler func(http.ResponseWriter, *http.Request)
type contentAuthenticated[T any] struct {
C T
Config html.ContentConfig
User platform.User
}
type ErrorAPI struct {
Message string `json:"message"`
}
func authenticatedHandlerJSON[T any](f handlerFunctionGet[T]) http.Handler {
var decoder = schema.NewDecoder()
type handlerBase func(context.Context, http.ResponseWriter, *http.Request) *nhttp.ErrorWithStatus
type handlerBaseAuthenticated func(context.Context, http.ResponseWriter, *http.Request, platform.User) *nhttp.ErrorWithStatus
type handlerFunctionDelete func(context.Context, *http.Request, platform.User) *nhttp.ErrorWithStatus
type handlerFunctionGet[T any] func(context.Context, *http.Request, resource.QueryParams) (*T, *nhttp.ErrorWithStatus)
type handlerFunctionGetAuthenticated[T any] func(context.Context, *http.Request, platform.User, resource.QueryParams) (*T, *nhttp.ErrorWithStatus)
type handlerFunctionGetImage func(context.Context, *http.Request, platform.User) (file.Collection, uuid.UUID, *nhttp.ErrorWithStatus)
type handlerFunctionGetSlice[T any] func(context.Context, *http.Request, resource.QueryParams) ([]*T, *nhttp.ErrorWithStatus)
type handlerFunctionGetSliceAuthenticated[T any] func(context.Context, *http.Request, platform.User, resource.QueryParams) ([]T, *nhttp.ErrorWithStatus)
type handlerFunctionPost[RequestType any, ResponseType any] func(context.Context, *http.Request, RequestType) (ResponseType, *nhttp.ErrorWithStatus)
type handlerFunctionPostAuthenticated[RequestType any, ResponseType any] func(context.Context, *http.Request, platform.User, RequestType) (ResponseType, *nhttp.ErrorWithStatus)
type handlerFunctionPostFormMultipart[RequestType any, ResponseType any] func(context.Context, *http.Request, RequestType) (*ResponseType, *nhttp.ErrorWithStatus)
type handlerFunctionPutAuthenticated[RequestType any] func(context.Context, *http.Request, platform.User, RequestType) (string, *nhttp.ErrorWithStatus)
func authenticatedHandlerBasic(f handlerBaseAuthenticated) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
ctx := r.Context()
e := f(ctx, w, r, u)
if e != nil {
respondErrorStatus(w, e)
return
}
return
})
}
func authenticatedHandlerDelete(f handlerFunctionDelete) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
ctx := r.Context()
e := f(ctx, r, u)
if e != nil {
respondErrorStatus(w, e)
return
}
http.Error(w, "", http.StatusNoContent)
return
})
}
func authenticatedHandlerGetImage(f handlerFunctionGetImage) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
ctx := r.Context()
collection, uid, e := f(ctx, r, u)
if e != nil {
respondErrorStatus(w, e)
return
}
file.ImageFileToWriter(collection, uid, w)
})
}
func authenticatedHandlerJSON[T any](f handlerFunctionGetAuthenticated[T]) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
ctx := r.Context()
var body []byte
var params queryParams
var params resource.QueryParams
err := decoder.Decode(&params, r.URL.Query())
if err != nil {
log.Error().Err(err).Msg("decode query failure")
http.Error(w, "failed to decode query", http.StatusInternalServerError)
respondErrorStatus(w, nhttp.NewBadRequest("failed to decode query: %w", err))
return
}
resp, e := f(ctx, r, u, params)
w.Header().Set("Content-Type", "application/json")
//log.Info().Str("template", template).Err(e).Msg("handler done")
if e != nil {
log.Warn().Int("status", e.Status).Err(e).Str("user message", e.Message).Msg("Responding with an error from api")
body, err = json.Marshal(ErrorAPI{Message: e.Error()})
if err != nil {
log.Error().Err(err).Msg("failed to marshal error")
http.Error(w, "{\"message\": \"boom. I can't even tell you what went wrong\"}", http.StatusInternalServerError)
return
}
http.Error(w, string(body), e.Status)
respondErrorStatus(w, e)
return
}
body, err = json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
_, err = w.Write(body)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to write json: %w", err))
return
}
})
}
func authenticatedHandlerJSONSlice[T any](f handlerFunctionGetSliceAuthenticated[T]) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
ctx := r.Context()
var body []byte
var params resource.QueryParams
err := decoder.Decode(&params, r.URL.Query())
if err != nil {
respondErrorStatus(w, nhttp.NewBadRequest("failed to decode query: %w", err))
return
}
resp, e := f(ctx, r, u, params)
w.Header().Set("Content-Type", "application/json")
//log.Info().Str("template", template).Err(e).Msg("handler done")
if e != nil {
respondErrorStatus(w, e)
return
}
if resp == nil {
body, err = json.Marshal([]struct{}{})
} else {
body, err = json.Marshal(resp)
}
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
_, err = w.Write(body)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to write json: %w", err))
return
}
})
}
func authenticatedHandlerJSONPost[RequestType any, ResponseType any](f handlerFunctionPostAuthenticated[RequestType, ResponseType]) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
w.Header().Set("Content-Type", "application/json")
req, e := parseRequest[RequestType](r)
if e != nil {
serializeError(w, e)
return
}
ctx := r.Context()
resp, e := f(ctx, r, u, *req)
if e != nil {
serializeError(w, e)
return
}
body, err := json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
_, err = w.Write(body)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to write json: %w", err))
return
}
})
}
func authenticatedHandlerJSONPut[RequestType any](f handlerFunctionPutAuthenticated[RequestType]) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
w.Header().Set("Content-Type", "application/json")
req, e := parseRequest[RequestType](r)
if e != nil {
serializeError(w, e)
return
}
ctx := r.Context()
path, e := f(ctx, r, u, *req)
if e != nil {
serializeError(w, e)
return
}
if path == "" {
w.WriteHeader(http.StatusNoContent)
return
}
w.Header().Set("Location", path)
w.WriteHeader(http.StatusCreated)
})
}
func authenticatedHandlerPostMultipart[ResponseType any](f handlerFunctionPostAuthenticated[[]file.Upload, ResponseType], collection file.Collection) http.Handler {
return auth.NewEnsureAuth(func(w http.ResponseWriter, r *http.Request, u platform.User) {
err := r.ParseMultipartForm(32 << 10) // 32 KB in memory; larger parts spill to temp files
if err != nil {
respondError(w, http.StatusBadRequest, "Failed to parse form: %w ", err)
return
}
uploads, err := file.SaveFileUploads(r, collection)
if err != nil {
respondError(w, http.StatusInternalServerError, "failed to save uploads: %w", err)
return
}
/*
err = decoder.Decode(&content, r.PostForm)
if err != nil {
respondError(w, http.StatusBadRequest, "Failed to decode form: %w", err)
return
}
*/
ctx := r.Context()
resp, e := f(ctx, r, u, uploads)
if e != nil {
http.Error(w, e.Error(), e.Status)
return
}
body, err := json.Marshal(resp)
if err != nil {
log.Error().Err(err).Msg("failed to marshal json")
http.Error(w, "{\"message\": \"failed to marshal json\"}", http.StatusInternalServerError)
return
}
w.Write(body)
})
}
type handlerFunctionPost[ReqType any, ResponseType any] func(context.Context, *http.Request, ReqType) (ResponseType, *nhttp.ErrorWithStatus)
func handlerBasic(f handlerBase) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
e := f(ctx, w, r)
if e != nil {
respondErrorStatus(w, e)
return
}
}
}
func handlerJSON[T any](f handlerFunctionGet[T]) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
var body []byte
var params resource.QueryParams
err := decoder.Decode(&params, r.URL.Query())
if err != nil {
respondErrorStatus(w, nhttp.NewBadRequest("failed to decode query: %w", err))
return
}
resp, e := f(ctx, r, params)
w.Header().Set("Content-Type", "application/json")
//log.Info().Str("template", template).Err(e).Msg("handler done")
if e != nil {
respondErrorStatus(w, e)
return
}
body, err = json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
w.Write(body)
}
}
func handlerJSONSlice[T any](f handlerFunctionGetSlice[T]) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
ctx := r.Context()
var body []byte
var params resource.QueryParams
err := decoder.Decode(&params, r.URL.Query())
if err != nil {
respondErrorStatus(w, nhttp.NewBadRequest("failed to decode query: %w", err))
return
}
resp, e := f(ctx, r, params)
w.Header().Set("Content-Type", "application/json")
//log.Info().Str("template", template).Err(e).Msg("handler done")
if e != nil {
respondErrorStatus(w, e)
return
}
body, err = json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
w.Write(body)
}
}
func handlerJSONPost[RequestType any, ResponseType any](f handlerFunctionPost[RequestType, ResponseType]) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
req, e := parseRequest[RequestType](r)
if e != nil {
serializeError(w, e)
return
}
ctx := r.Context()
resp, e := f(ctx, r, *req)
if e != nil {
serializeError(w, e)
return
}
body, err := json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
w.Write(body)
}
}
func handlerJSONPut[RequestType any, ResponseType any](f handlerFunctionPost[RequestType, ResponseType]) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
req, e := parseRequest[RequestType](r)
if e != nil {
serializeError(w, e)
return
}
ctx := r.Context()
resp, e := f(ctx, r, *req)
if e != nil {
serializeError(w, e)
return
}
body, err := json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
w.Write(body)
}
}
func handlerFormPost[RequestType any, ResponseType any](f handlerFunctionPostFormMultipart[RequestType, ResponseType]) http.HandlerFunc {
return func(w http.ResponseWriter, r *http.Request) {
w.Header().Set("Content-Type", "application/json")
err := r.ParseMultipartForm(32 << 12) // 128 KB in memory; larger parts spill to temp files
if err != nil {
respondErrorStatus(w, nhttp.NewBadRequest("bad form: %w", err))
return
}
var req RequestType
err = decoder.Decode(&req, r.PostForm)
if err != nil {
respondErrorStatus(w, nhttp.NewBadRequest("decode form: %w", err))
return
}
ctx := r.Context()
resp, e := f(ctx, r, req)
if e != nil {
serializeError(w, e)
return
}
body, err := json.Marshal(resp)
if err != nil {
respondErrorStatus(w, nhttp.NewError("failed to marshal json: %w", err))
return
}
w.Write(body)
}
}
func parseRequest[RequestType any](r *http.Request) (*RequestType, *nhttp.ErrorWithStatus) {
var err error
var req RequestType
content_type := r.Header.Get("Content-Type")
switch content_type {
case "application/json":
body, e := io.ReadAll(r.Body)
if e != nil {
return nil, nhttp.NewError("failed to read body: %w", e)
}
err = json.Unmarshal(body, &req)
case "application/x-www-form-urlencoded":
e := r.ParseForm()
if e != nil {
return nil, nhttp.NewBadRequest("parsing form: %w", e)
}
err = decoder.Decode(&req, r.PostForm)
default:
return nil, nhttp.NewBadRequest("unrecognized content type '%s'", content_type)
}
if err != nil {
return nil, nhttp.NewErrorStatus(http.StatusBadRequest, "Failed to decode request: %w", err)
}
return &req, nil
}
func serializeError(w http.ResponseWriter, e *nhttp.ErrorWithStatus) {
log.Warn().Int("status", e.Status).Err(e).Str("user message", e.Message).Msg("Responding with an error from api")
body, err := json.Marshal(ErrorAPI{Message: e.Error()})
if err != nil {
log.Error().Err(err).Msg("failed to marshal error")
http.Error(w, "{\"message\": \"boom. I can't even tell you what went wrong\"}", http.StatusInternalServerError)
return
}
http.Error(w, string(body), e.Status)
return
}
func respondError(w http.ResponseWriter, status int, format string, args ...any) {
outer_err := fmt.Errorf(format, args...)
body, err := json.Marshal(ErrorAPI{Message: outer_err.Error()})
if err != nil {
log.Error().Err(err).Msg("failed to marshal error")
http.Error(w, "{\"message\": \"boom. I can't even tell you what went wrong\"}", http.StatusInternalServerError)
return
}
http.Error(w, string(body), status)
}
func respondErrorStatus(w http.ResponseWriter, e *nhttp.ErrorWithStatus) {
log.Warn().Int("status", e.Status).Err(e).Str("user message", e.Message).Msg("Responding with an error from api")
body, err := json.Marshal(ErrorAPI{Message: e.Error()})
if err != nil {
log.Error().Err(err).Msg("failed to marshal error")
http.Error(w, "{\"message\": \"boom. I can't even tell you what went wrong\"}", http.StatusInternalServerError)
return
}
http.Error(w, string(body), e.Status)
}


@ -11,14 +11,14 @@ import (
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/aarondl/opt/omit"
"github.com/aarondl/opt/omitnull"
"github.com/google/uuid"
"github.com/gorilla/mux"
"github.com/rs/zerolog/log"
)
func apiImagePost(w http.ResponseWriter, r *http.Request, u platform.User) {
vars := mux.Vars(r)
id := vars["uuid"]
noteUUID, err := uuid.Parse(id)
if err != nil {
http.Error(w, "Failed to decode the uuid", http.StatusBadRequest)
@ -38,41 +38,44 @@ func apiImagePost(w http.ResponseWriter, r *http.Request, u platform.User) {
}
ctx := r.Context()
setter := models.NoteImageSetter{
Created: omit.From(payload.Created),
CreatorID: omit.From(int32(u.ID)),
Deleted: omitnull.FromPtr(payload.Deleted),
DeletorID: omitnull.FromPtr(payload.DeletorID),
OrganizationID: omit.From(u.Organization.ID),
Version: omit.From(payload.Version),
UUID: omit.From(noteUUID),
}
err = platform.NoteImageCreate(ctx, u, setter)
if err != nil {
renderShim(w, r, errRender(err))
return
}
w.WriteHeader(http.StatusAccepted)
}
func apiImageContentGet(w http.ResponseWriter, r *http.Request, u platform.User) {
vars := mux.Vars(r)
u_str := vars["uuid"]
imageUUID, err := uuid.Parse(u_str)
if err != nil {
log.Error().Err(err).Msg("Failed to parse image UUID")
http.Error(w, "Failed to parse image UUID", http.StatusBadRequest)
return
}
file.ImageFileToWriter(file.CollectionPublicImage, imageUUID, w)
}
func apiImageContentPost(w http.ResponseWriter, r *http.Request, u platform.User) {
vars := mux.Vars(r)
u_str := vars["uuid"]
imageUUID, err := uuid.Parse(u_str)
if err != nil {
log.Error().Err(err).Msg("Failed to parse image UUID")
http.Error(w, "Failed to parse image UUID", http.StatusBadRequest)
return
}
err = file.ImageFileFromReader(file.CollectionImageRaw, imageUUID, r.Body)
if err != nil {
renderShim(w, r, errRender(err))
return
}
w.WriteHeader(http.StatusOK)


@ -1,55 +1 @@
package api
import (
"context"
"net/http"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
)
type createLead struct {
PoolLocations map[int]platform.Location `json:"pool_locations"`
SignalIDs []int `json:"signal_ids"`
}
type createdLead struct {
ID int32 `json:"id"`
}
type contentListLead struct {
Leads []lead `json:"leads"`
}
type lead struct {
ID int32 `json:"id"`
}
func listLead(ctx context.Context, r *http.Request, user platform.User, query queryParams) (*contentListLead, *nhttp.ErrorWithStatus) {
return &contentListLead{
Leads: make([]lead, 0),
}, nil
}
func postLeads(ctx context.Context, r *http.Request, user platform.User, req createLead) (*createdLead, *nhttp.ErrorWithStatus) {
if len(req.SignalIDs) == 0 {
return nil, nhttp.NewErrorStatus(http.StatusBadRequest, "can't make a lead with no signals")
}
if len(req.SignalIDs) > 1 {
return nil, nhttp.NewErrorStatus(http.StatusBadRequest, "can't make a lead with multiple signals yet")
}
signal_id := req.SignalIDs[0]
var pool_location *platform.Location
l, ok := req.PoolLocations[signal_id]
if ok {
pool_location = &l
}
site_id, err := platform.SiteFromSignal(ctx, user, int32(signal_id))
if err != nil || site_id == nil {
return nil, nhttp.NewError("site from signal: %w", err)
}
lead_id, err := platform.LeadCreate(ctx, user, int32(signal_id), *site_id, pool_location)
if err != nil || lead_id == nil {
return nil, nhttp.NewError("lead create: %w", err)
}
return &createdLead{
ID: *lead_id,
}, nil
}


@ -2,62 +2,49 @@ package api
import (
"context"
"fmt"
"net/http"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
)
type formPublicreportSignal struct {
ReportID string `json:"reportID"`
}
func postPublicreportSignal(ctx context.Context, r *http.Request, user platform.User, req formPublicreportSignal) (string, *nhttp.ErrorWithStatus) {
signal_id, err := platform.SignalCreateFromPublicreport(ctx, user, req.ReportID)
if err != nil {
return "", nhttp.NewError("create signal: %w", err)
}
return fmt.Sprintf("/signal/%d", *signal_id), nil
}
type formPublicreportInvalid struct {
ReportID string `json:"reportID"`
}
func postPublicreportInvalid(ctx context.Context, r *http.Request, user platform.User, req formPublicreportSignal) (string, *nhttp.ErrorWithStatus) {
err := platform.PublicReportInvalid(ctx, user, req.ReportID)
if err != nil {
return "", nhttp.NewError("mark report invalid: %w", err)
}
return fmt.Sprintf("/publicreport/%s", req.ReportID), nil
}
type formPublicreportMessage struct {
Message string `json:"message"`
ReportID string `json:"reportID"`
}
func postPublicreportMessage(ctx context.Context, r *http.Request, user platform.User, req formPublicreportMessage) (string, *nhttp.ErrorWithStatus) {
msg_id, err := platform.PublicReportMessageCreate(ctx, user, req.ReportID, req.Message)
if err != nil {
return "", nhttp.NewError("failed to create message: %w", err)
}
if msg_id == nil {
return "", nhttp.NewError("nil message id")
}
return fmt.Sprintf("/message/%d", *msg_id), nil
}


@ -1,25 +0,0 @@
package api
type queryParams struct {
Limit *int `schema:"limit"`
Sort *string `schema:"sort"`
Type *string `schema:"type"`
}
func (qp queryParams) SortOrDefault(default_name string, ascending bool) (string, bool) {
if qp.Sort == nil {
return default_name, ascending
}
s := *qp.Sort
if s == "" {
return default_name, ascending
}
a := true
if s[0] == '-' {
a = false
}
if s[0] == '+' || s[0] == '-' {
s = s[1:]
}
return s, a
}


@ -2,147 +2,28 @@ package api
import (
"context"
"errors"
"fmt"
"net/http"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
)
type createReviewPool struct {
Status string `json:"status"`
TaskID int32 `json:"task_id"`
Updates *platform.PoolUpdate `json:"updates"`
}
func postReviewPool(ctx context.Context, r *http.Request, user platform.User, req createReviewPool) (string, *nhttp.ErrorWithStatus) {
id, err := platform.ReviewPoolCreate(ctx, user, req.TaskID, req.Status, req.Updates)
if err != nil {
if errors.As(err, &platform.ErrorNotFound{}) {
return "", nhttp.NewErrorStatus(http.StatusNotFound, "review task %d not found", req.TaskID)
}
return "", nhttp.NewError("failed to set review: %w", err)
}
return fmt.Sprintf("/review/%d", id), nil
}


@ -1,49 +1,169 @@
package api
import (
"github.com/Gleipnir-Technology/nidus-sync/auth"
"github.com/Gleipnir-Technology/nidus-sync/platform/file"
"github.com/Gleipnir-Technology/nidus-sync/resource"
"github.com/gorilla/mux"
)
func AddRoutesRMO(r *mux.Router) {
router := resource.NewRouter(r)
compliance_request := resource.ComplianceRequest(router)
district := resource.District(router)
geocode := resource.Geocode(router)
nuisance := resource.Nuisance(router)
pr_compliance := resource.PublicReportCompliance(router)
publicreport := resource.Publicreport(router)
publicreport_notification := resource.PublicreportNotification(router)
qrcode := resource.QRCode(router)
water := resource.Water(router)
r.HandleFunc("", handlerJSON(getRoot))
r.HandleFunc("/compliance-request/image/pool/{public_id}", compliance_request.ImagePoolGet).Methods("GET").Name("compliance-request.image.pool.ByIDGet")
r.Handle("/district", handlerJSONSlice(district.List)).Methods("GET")
r.Handle("/district/{id}", handlerJSON(district.GetByID)).Methods("GET").Name("district.ByIDGet")
r.HandleFunc("/district/{slug}/logo", apiGetDistrictLogo).Methods("GET").Name("district.logo.BySlug")
r.Handle("/geocode/by-gid/{id:.*}", handlerJSON(geocode.ByGID)).Methods("GET")
r.Handle("/geocode/reverse", handlerJSONPost(geocode.Reverse)).Methods("POST")
r.Handle("/geocode/reverse/closest", handlerJSONPost(geocode.ReverseClosest)).Methods("POST")
r.Handle("/geocode/suggestion", handlerJSONSlice(geocode.SuggestionList)).Methods("GET")
r.Handle("/publicreport-notification", handlerJSONPost(publicreport_notification.Create)).Methods("POST")
r.Handle("/qr-code/mailer/{code}", handlerBasic(qrcode.Mailer)).Methods("GET")
r.Handle("/qr-code/marketing", handlerBasic(qrcode.Marketing)).Methods("GET")
r.Handle("/qr-code/report/{code}", handlerBasic(qrcode.Report)).Methods("GET")
r.HandleFunc("/rmo/compliance", handlerJSONPost(pr_compliance.Create)).Methods("POST")
r.HandleFunc("/rmo/nuisance", handlerFormPost(nuisance.Create)).Methods("POST")
r.Handle("/rmo/publicreport/{id}", handlerBasic(publicreport.ByIDPublic)).Methods("GET").Name("publicreport.ByIDGetPublic")
r.Handle("/rmo/publicreport/compliance/{id}/image", handlerFormPost(publicreport.ImageCreate)).Methods("POST")
r.Handle("/rmo/publicreport/compliance/{id}", handlerJSON(pr_compliance.ByIDPublic)).Methods("GET").Name("publicreport.compliance.ByIDGetPublic")
r.Handle("/rmo/publicreport/compliance/{id}", handlerJSONPut(pr_compliance.Update)).Methods("PUT")
r.Handle("/rmo/publicreport/nuisance/{id}", handlerJSON(nuisance.ByIDPublic)).Methods("GET").Name("publicreport.nuisance.ByIDGetPublic")
r.Handle("/rmo/publicreport/water/{id}", handlerJSON(water.ByIDPublic)).Methods("GET").Name("publicreport.water.ByIDGetPublic")
r.HandleFunc("/rmo/water", handlerFormPost(water.Create)).Methods("POST")
}
func AddRoutesSync(r *mux.Router) {
router := resource.NewRouter(r)
compliance_request := resource.ComplianceRequest(router)
district := resource.District(router)
geocode := resource.Geocode(router)
lob_hook := resource.LobHook(router)
nuisance := resource.Nuisance(router)
pr_compliance := resource.PublicReportCompliance(router)
publicreport := resource.Publicreport(router)
publicreport_notification := resource.PublicreportNotification(router)
qrcode := resource.QRCode(router)
service_request := resource.ServiceRequest(router)
water := resource.Water(router)
//r.Use(render.SetContentType(render.ContentTypeJSON))
// Unauthenticated endpoints
r.HandleFunc("", handlerJSON(getRoot))
r.HandleFunc("/compliance-request/image/pool/{public_id}", compliance_request.ImagePoolGet).Methods("GET").Name("compliance-request.image.pool.ByIDGet")
r.Handle("/district", handlerJSONSlice(district.List)).Methods("GET")
r.Handle("/district/{id}", handlerJSON(district.GetByID)).Methods("GET").Name("district.ByIDGet")
r.HandleFunc("/district/{slug}/logo", apiGetDistrictLogo).Methods("GET").Name("district.logo.BySlug")
r.Handle("/geocode/by-gid/{id:.*}", handlerJSON(geocode.ByGID)).Methods("GET")
r.Handle("/geocode/reverse", handlerJSONPost(geocode.Reverse)).Methods("POST")
r.Handle("/geocode/reverse/closest", handlerJSONPost(geocode.ReverseClosest)).Methods("POST")
r.Handle("/geocode/suggestion", handlerJSONSlice(geocode.SuggestionList)).Methods("GET")
r.Handle("/lob/event", handlerBasic(lob_hook.Event)).Methods("POST")
r.Handle("/publicreport-notification", handlerJSONPost(publicreport_notification.Create)).Methods("POST")
r.Handle("/qr-code/mailer/{code}", handlerBasic(qrcode.Mailer)).Methods("GET")
r.Handle("/qr-code/marketing", handlerBasic(qrcode.Marketing)).Methods("GET")
r.Handle("/qr-code/report/{code}", handlerBasic(qrcode.Report)).Methods("GET")
r.HandleFunc("/signin", handlerJSONPost(postSignin))
r.Handle("/signout", authenticatedHandlerBasic(postSignout))
r.HandleFunc("/signup", handlerJSONPost(postSignup))
r.HandleFunc("/twilio/call", twilioCallPost).Methods("POST")
r.HandleFunc("/twilio/call/status", twilioCallStatusPost).Methods("POST")
r.HandleFunc("/twilio/message", twilioMessagePost).Methods("POST")
r.HandleFunc("/twilio/text", twilioTextPost).Methods("POST")
r.HandleFunc("/twilio/text/status", twilioTextStatusPost).Methods("POST")
r.HandleFunc("/voipms/text", voipmsTextGet).Methods("GET")
r.HandleFunc("/voipms/text", voipmsTextPost).Methods("POST")
r.HandleFunc("/webhook/fieldseeker", webhookFieldseeker).Methods("GET")
r.HandleFunc("/webhook/fieldseeker", webhookFieldseeker).Methods("POST")
// Authenticated endpoints
r.Handle("/audio/{uuid}", auth.NewEnsureAuth(apiAudioPost)).Methods("POST")
r.Handle("/audio/{uuid}/content", auth.NewEnsureAuth(apiAudioContentPost)).Methods("POST")
avatar := resource.Avatar(router)
r.Handle("/avatar/{uuid}", authenticatedHandlerGetImage(avatar.ByUUIDGet)).Methods("GET").Name("avatar.ByUUIDGet")
r.Handle("/avatar", authenticatedHandlerPostMultipart(avatar.Create, file.CollectionAvatar)).Methods("POST")
r.Handle("/client/ios", auth.NewEnsureAuth(handleClientIos)).Methods("GET")
communication := resource.Communication(router)
r.Handle("/communication", authenticatedHandlerJSONSlice(communication.List)).Methods("GET")
r.Handle("/communication/{id}", authenticatedHandlerJSON(communication.Get)).Methods("GET").Name("communication.ByIDGet")
r.Handle("/communication/{id}/mark/invalid", authenticatedHandlerJSONPost(communication.MarkInvalid)).Methods("POST").Name("communication.MarkInvalid")
r.Handle("/communication/{id}/mark/pending-response", authenticatedHandlerJSONPost(communication.MarkPendingResponse)).Methods("POST").Name("communication.MarkPendingResponse")
r.Handle("/communication/{id}/mark/possible-issue", authenticatedHandlerJSONPost(communication.MarkPossibleIssue)).Methods("POST").Name("communication.MarkPossibleIssue")
r.Handle("/communication/{id}/mark/possible-resolved", authenticatedHandlerJSONPost(communication.MarkPossibleResolved)).Methods("POST").Name("communication.MarkPossibleResolved")
r.Handle("/compliance-request/mailer", authenticatedHandlerJSONPost(compliance_request.CreateMailer)).Methods("POST")
//r.HandleFunc("/compliance-request/image/pool/{public_id}", getComplianceRequestImagePool).Methods("GET")
r.Handle("/configuration/integration/arcgis", authenticatedHandlerJSONPost(postConfigurationIntegrationArcgis)).Methods("POST")
r.Handle("/events", auth.NewEnsureAuth(streamEvents)).Methods("GET")
r.Handle("/image/{uuid}", auth.NewEnsureAuth(apiImagePost)).Methods("POST")
r.Handle("/image/{uuid}/content", auth.NewEnsureAuth(apiImageContentGet)).Methods("GET")
r.Handle("/image/{uuid}/content", auth.NewEnsureAuth(apiImageContentPost)).Methods("POST")
impersonation := resource.Impersonation(router)
r.Handle("/impersonation", authenticatedHandlerJSONPost(impersonation.Create)).Methods("POST")
r.Handle("/impersonation", authenticatedHandlerDelete(impersonation.Delete)).Methods("DELETE")
lead := resource.Lead(r)
r.Handle("/leads", authenticatedHandlerJSON(lead.List)).Methods("GET")
r.Handle("/leads", authenticatedHandlerJSONPost(lead.Create)).Methods("POST")
mailer := resource.Mailer(router)
r.Handle("/mailer", authenticatedHandlerJSONSlice(mailer.List)).Methods("GET")
r.Handle("/mailer/{id}", authenticatedHandlerJSONPost(mailer.ByIDGet)).Methods("GET").Name("mailer.ByIDGet")
r.Handle("/mosquito-source", auth.NewEnsureAuth(apiMosquitoSource)).Methods("GET")
r.Handle("/publicreport/invalid", authenticatedHandlerJSONPost(postPublicreportInvalid)).Methods("POST")
r.Handle("/publicreport/signal", authenticatedHandlerJSONPost(postPublicreportSignal)).Methods("POST")
r.Handle("/publicreport/message", authenticatedHandlerJSONPost(postPublicreportMessage)).Methods("POST")
r.Handle("/publicreport/{id}", authenticatedHandlerBasic(publicreport.ByID)).Methods("GET").Name("publicreport.ByIDGet")
r.Handle("/publicreport/compliance/{id}", authenticatedHandlerJSON(pr_compliance.ByID)).Methods("GET").Name("publicreport.compliance.ByIDGet")
r.Handle("/publicreport/nuisance/{id}", authenticatedHandlerJSON(nuisance.ByID)).Methods("GET").Name("publicreport.nuisance.ByIDGet")
r.Handle("/publicreport/water/{id}", authenticatedHandlerJSON(water.ByID)).Methods("GET").Name("publicreport.water.ByIDGet")
r.Handle("/publicreport-notification", handlerJSONPost(publicreport_notification.Create)).Methods("POST")
r.Handle("/review/pool", authenticatedHandlerJSONPost(postReviewPool)).Methods("POST")
review_task := resource.ReviewTask(r)
r.Handle("/review-task", authenticatedHandlerJSON(review_task.List)).Methods("GET")
r.Handle("/service-request", authenticatedHandlerJSONSlice(service_request.List)).Methods("GET")
session := resource.Session(router)
r.Handle("/session", authenticatedHandlerJSON(session.Get)).Methods("GET").Name("session.get")
signal := resource.Signal(r)
r.Handle("/signal", authenticatedHandlerJSON(signal.List)).Methods("GET")
site := resource.Site(router)
r.Handle("/site", authenticatedHandlerJSONSlice(site.List)).Methods("GET")
r.Handle("/site/{id}", authenticatedHandlerJSON(site.ByIDGet)).Methods("GET").Name("site.ByIDGet")
sync := resource.Sync(r)
r.Handle("/sync", authenticatedHandlerJSONSlice(sync.List)).Methods("GET")
r.Handle("/sudo/email", authenticatedHandlerJSONPost(postSudoEmail)).Methods("POST")
r.Handle("/sudo/sms", authenticatedHandlerJSONPost(postSudoSMS)).Methods("POST")
r.Handle("/sudo/sse", authenticatedHandlerJSONPost(postSudoSSE)).Methods("POST")
r.Handle("/trap-data", auth.NewEnsureAuth(apiTrapData)).Methods("GET")
r.Handle("/tile/{z}/{y}/{x}", auth.NewEnsureAuth(getTile)).Methods("GET")
upload := resource.Upload(r)
r.Handle("/upload/pool/custom", authenticatedHandlerPostMultipart(upload.PoolCustomCreate, file.CollectionCSV)).Methods("POST")
r.Handle("/upload/pool/flyover", authenticatedHandlerPostMultipart(upload.PoolFlyoverCreate, file.CollectionCSV)).Methods("POST")
r.Handle("/upload", authenticatedHandlerJSON(upload.List)).Methods("GET")
r.Handle("/upload/{id}", authenticatedHandlerJSON(upload.ByIDGet)).Methods("GET")
r.Handle("/upload/{id}/commit", authenticatedHandlerJSONPost(upload.Commit)).Methods("POST")
r.Handle("/upload/{id}/discard", authenticatedHandlerJSONPost(upload.Discard)).Methods("POST")
user := resource.User(router)
r.Handle("/user/self", authenticatedHandlerJSON(user.SelfGet)).Methods("GET")
r.Handle("/user/suggestion", authenticatedHandlerJSON(user.SuggestionGet)).Methods("GET")
r.Handle("/user", authenticatedHandlerJSONSlice(user.List)).Methods("GET")
r.Handle("/user/{id}", authenticatedHandlerJSON(user.ByIDGet)).Methods("GET").Name("user.ByIDGet")
r.Handle("/user/{id}", authenticatedHandlerJSONPut(user.ByIDPut)).Methods("PUT")
// Unauthenticated endpoints
r.Get("/district", apiGetDistrict)
r.Get("/district/{slug}/logo", apiGetDistrictLogo)
r.Get("/compliance-request/image/pool/{public_id}", getComplianceRequestImagePool)
r.Post("/signin", postSignin)
r.Post("/twilio/call", twilioCallPost)
r.Post("/twilio/call/status", twilioCallStatusPost)
r.Post("/twilio/message", twilioMessagePost)
r.Post("/twilio/text", twilioTextPost)
r.Post("/twilio/text/status", twilioTextStatusPost)
r.Get("/voipms/text", voipmsTextGet)
r.Post("/voipms/text", voipmsTextPost)
r.Get("/webhook/fieldseeker", webhookFieldseeker)
r.Post("/webhook/fieldseeker", webhookFieldseeker)
}


@ -1,144 +1 @@
package api
import (
"context"
"net/http"
"time"
"github.com/Gleipnir-Technology/bob"
"github.com/Gleipnir-Technology/bob/dialect/psql"
"github.com/Gleipnir-Technology/bob/dialect/psql/sm"
"github.com/Gleipnir-Technology/nidus-sync/db"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/types"
//"github.com/aarondl/opt/null"
"github.com/stephenafamo/scan"
)
type signal struct {
Address types.Address `json:"address"`
Addressed *time.Time `json:"addressed"`
Addressor *platform.User `json:"addressor"`
Created time.Time `json:"created"`
Creator platform.User `json:"creator"`
ID int32 `json:"id"`
Location types.Location `json:"location"`
Species string `json:"species"`
Title string `json:"title"`
Type string `json:"type"`
}
type contentListSignal struct {
Signals []signal `json:"signals"`
}
func listSignal(ctx context.Context, r *http.Request, user platform.User, query queryParams) (*contentListSignal, *nhttp.ErrorWithStatus) {
type _Row struct {
Address types.Address `db:"address"`
Addressed *time.Time `db:"addressed"`
Addressor *int32 `db:"addressor"`
Created time.Time `db:"created"`
Creator int32 `db:"creator_id"`
ID int32 `db:"id"`
Latitude float64 `db:"latitude"`
Longitude float64 `db:"longitude"`
Location types.Location `db:"location"`
Species *string `db:"species"`
Title string `db:"title"`
Type string `db:"type"`
}
limit := 20
if query.Limit != nil {
limit = *query.Limit
}
rows, err := bob.All(ctx, db.PGInstance.BobDB, psql.Select(
sm.Columns(
"signal.addressed AS addressed",
"signal.addressor AS addressor",
"signal.created AS created",
"signal.creator AS creator_id",
"signal.id AS id",
"signal.species AS species",
"signal.title AS title",
"signal.type_ AS type",
"address.country AS \"address.country\"",
"address.locality AS \"address.locality\"",
"address.number_ AS \"address.number\"",
"address.postal_code AS \"address.postal_code\"",
"address.region AS \"address.region\"",
"address.street AS \"address.street\"",
"address.unit AS \"address.unit\"",
"ST_Y(address.geom) AS latitude",
"ST_X(address.geom) AS longitude",
),
sm.From("signal"),
sm.InnerJoin("signal_pool").OnEQ(
psql.Quote("signal", "id"),
psql.Quote("signal_pool", "signal_id"),
),
sm.InnerJoin("pool").OnEQ(
psql.Quote("signal_pool", "pool_id"),
psql.Quote("pool", "id"),
),
sm.InnerJoin("site").On(
psql.Quote("pool", "site_id").EQ(psql.Quote("site", "id")),
),
sm.InnerJoin("address").OnEQ(
psql.Quote("site", "address_id"),
psql.Quote("address", "id"),
),
sm.Where(psql.Quote("signal", "organization_id").EQ(psql.Arg(user.Organization.ID()))),
sm.Where(psql.Quote("signal", "addressed").IsNull()),
sm.Limit(limit),
), scan.StructMapper[_Row]())
/*
rows, err := models.Signals.Query(
models.SelectWhere.Signals.OrganizationID.EQ(org.ID),
sm.OrderBy("created").Desc(),
).All(ctx, db.PGInstance.BobDB)
*/
if err != nil {
return nil, nhttp.NewError("failed to get signals: %w", err)
}
users_by_id, err := platform.UsersByOrg(ctx, user.Organization)
if err != nil {
return nil, nhttp.NewError("users by id: %w", err)
}
signals := make([]signal, len(rows))
for i, row := range rows {
var species string = ""
if row.Species != nil {
species = *row.Species
}
signals[i] = signal{
Address: row.Address,
Addressed: row.Addressed,
Addressor: userOrNil(users_by_id, row.Addressor),
Created: row.Created,
Creator: *users_by_id[row.Creator],
ID: row.ID,
Location: types.Location{
Latitude: row.Latitude,
Longitude: row.Longitude,
},
Species: species,
Title: row.Title,
Type: row.Type,
}
}
return &contentListSignal{
Signals: signals,
}, nil
}
func userOrNil(usersByID map[int32]*platform.User, id *int32) *platform.User {
if id == nil {
return nil
}
u, ok := usersByID[*id]
if !ok {
return nil
}
return u
}


@ -1,46 +1,46 @@
package api
import (
"context"
"errors"
"fmt"
"net/http"
"github.com/Gleipnir-Technology/nidus-sync/auth"
"github.com/go-chi/render"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/rs/zerolog/log"
)
func postSignin(w http.ResponseWriter, r *http.Request) {
if err := r.ParseForm(); err != nil {
render.Render(w, r, errRender(fmt.Errorf("Failed to parse POST form: %w", err)))
return
}
type reqSignin struct {
Password string `schema:"password"`
Username string `schema:"username"`
}
username := r.FormValue("username")
password := r.FormValue("password")
if password == "" || username == "" {
w.Header().Set("WWW-Authenticate-Error", "no-credentials")
http.Error(w, "invalid-credentials", http.StatusUnauthorized)
return
func postSignin(ctx context.Context, r *http.Request, req reqSignin) (string, *nhttp.ErrorWithStatus) {
if req.Password == "" {
return "", nhttp.NewBadRequest("Empty password")
}
log.Info().Str("username", username).Msg("API Signin")
_, err := auth.SigninUser(r, username, password)
if req.Username == "" {
return "", nhttp.NewBadRequest("Empty username")
}
log.Info().Str("username", req.Username).Msg("API Signin")
_, err := auth.SigninUser(r, req.Username, req.Password)
if err != nil {
if errors.Is(err, auth.InvalidCredentials{}) {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
http.Error(w, "invalid-credentials", http.StatusUnauthorized)
return
return "", nhttp.NewUnauthorized("invalid credentials")
}
if errors.Is(err, auth.InvalidUsername{}) {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
http.Error(w, "invalid-credentials", http.StatusUnauthorized)
return
return "", nhttp.NewUnauthorized("invalid credentials")
}
log.Error().Err(err).Str("username", username).Msg("Login server error")
http.Error(w, "signin-server-error", http.StatusInternalServerError)
return
if errors.Is(err, platform.NoUserError{}) {
return "", nhttp.NewUnauthorized("invalid credentials")
}
log.Error().Err(err).Str("username", req.Username).Msg("Login server error")
return "", nhttp.NewError("login server error")
}
http.Error(w, "", http.StatusAccepted)
return "/", nil
}
func postSignout(ctx context.Context, w http.ResponseWriter, r *http.Request, u platform.User) *nhttp.ErrorWithStatus {
auth.SignoutUser(r, u)
return nil
}

api/signup.go Normal file

@ -0,0 +1,37 @@
package api
import (
"context"
"net/http"
"strings"
"github.com/Gleipnir-Technology/nidus-sync/auth"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/rs/zerolog/log"
)
type reqSignup struct {
Username string `json:"username"`
Name string `json:"name"`
Password string `json:"password"`
Terms bool `json:"terms"`
}
func postSignup(ctx context.Context, r *http.Request, signup reqSignup) (string, *nhttp.ErrorWithStatus) {
log.Info().Str("username", signup.Username).Str("name", signup.Name).Str("password", strings.Repeat("*", len(signup.Password))).Msg("Signup")
if !signup.Terms {
log.Warn().Msg("Terms not agreed")
return "", nhttp.NewErrorStatus(http.StatusBadRequest, "You must agree to the terms to register")
}
user, err := auth.SignupUser(r.Context(), signup.Username, signup.Name, signup.Password)
if err != nil {
return "", nhttp.NewError("Failed to signup user: %w", err)
}
auth.AddUserSession(ctx, user)
return "/", nil
}

api/sudo.go Normal file

@ -0,0 +1,104 @@
package api
import (
"context"
"fmt"
"net/http"
"github.com/Gleipnir-Technology/nidus-sync/comms/email"
"github.com/Gleipnir-Technology/nidus-sync/comms/text"
"github.com/Gleipnir-Technology/nidus-sync/config"
"github.com/Gleipnir-Technology/nidus-sync/html"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/rs/zerolog/log"
)
type contentSudo struct {
ForwardEmailRMOAddress string
ForwardEmailNidusAddress string
}
func getSudo(ctx context.Context, r *http.Request, user platform.User) (*html.Response[contentSudo], *nhttp.ErrorWithStatus) {
if !user.HasRoot() {
return nil, &nhttp.ErrorWithStatus{
Message: "You have to be a root user to access this",
Status: http.StatusForbidden,
}
}
content := contentSudo{
ForwardEmailRMOAddress: config.ForwardEmailRMOAddress,
ForwardEmailNidusAddress: config.ForwardEmailNidusAddress,
}
return html.NewResponse("sync/sudo.html", content), nil
}
type FormEmail struct {
Body string `schema:"emailBody"`
From string `schema:"emailFrom"`
Subject string `schema:"emailSubject"`
To string `schema:"emailTo"`
}
func postSudoEmail(ctx context.Context, r *http.Request, u platform.User, e FormEmail) (string, *nhttp.ErrorWithStatus) {
if !u.HasRoot() {
return "", &nhttp.ErrorWithStatus{
Message: "You must have sudo powers to do this",
Status: http.StatusForbidden,
}
}
request := email.Request{
From: e.From,
HTML: fmt.Sprintf("<html><p>%s</p></html>", e.Body),
Sender: e.From,
Subject: e.Subject,
To: e.To,
Text: e.Body,
}
resp, err := email.Send(ctx, request)
if err != nil {
log.Warn().Err(err).Msg("Failed to send email")
} else {
log.Info().Str("id", resp.ID).Str("to", e.To).Msg("Sent Email")
}
return "/sudo", nil
}
type FormSMS struct {
Message string `schema:"smsMessage"`
Phone string `schema:"smsPhone"`
}
func postSudoSMS(ctx context.Context, r *http.Request, u platform.User, sms FormSMS) (string, *nhttp.ErrorWithStatus) {
if !u.HasRoot() {
return "", &nhttp.ErrorWithStatus{
Message: "You must have sudo powers to do this",
Status: http.StatusForbidden,
}
}
id, err := text.SendText(ctx, config.VoipMSNumber, sms.Phone, sms.Message)
if err != nil {
log.Warn().Err(err).Msg("Failed to send SMS")
} else {
log.Info().Str("id", id).Msg("Sent SMS")
}
return "/sudo", nil
}
type FormSSE struct {
OrganizationID int32 `schema:"organizationID"`
Resource string `schema:"resource"`
Type string `schema:"type"`
URIPath string `schema:"uriPath"`
}
func postSudoSSE(ctx context.Context, r *http.Request, u platform.User, sse FormSSE) (string, *nhttp.ErrorWithStatus) {
if !u.HasRoot() {
return "", &nhttp.ErrorWithStatus{
Message: "You must have sudo powers to do this",
Status: http.StatusForbidden,
}
}
platform.SudoEvent(sse.OrganizationID, sse.Resource, sse.Type, sse.URIPath)
return "/sudo", nil
}


@ -5,14 +5,15 @@ import (
"strconv"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/go-chi/chi/v5"
"github.com/rs/zerolog/log"
"github.com/gorilla/mux"
//"github.com/rs/zerolog/log"
)
func getTile(w http.ResponseWriter, r *http.Request, user platform.User) {
x_str := chi.URLParam(r, "x")
y_str := chi.URLParam(r, "y")
z_str := chi.URLParam(r, "z")
vars := mux.Vars(r)
x_str := vars["x"]
y_str := vars["y"]
z_str := vars["z"]
x, err := strconv.Atoi(x_str)
if err != nil {
@ -29,9 +30,8 @@ func getTile(w http.ResponseWriter, r *http.Request, user platform.User) {
http.Error(w, "can't parse x as an integer", http.StatusBadRequest)
return
}
err = platform.GetTile(r.Context(), w, user.Organization, uint(z), uint(y), uint(x))
err = platform.GetTile(r.Context(), w, user.Organization, true, uint(z), uint(y), uint(x))
if err != nil {
log.Error().Err(err).Msg("failed to do tile")
http.Error(w, "failed to do tile", http.StatusInternalServerError)
return
}


@ -7,8 +7,9 @@ import (
"github.com/Gleipnir-Technology/nidus-sync/db/models"
"github.com/Gleipnir-Technology/nidus-sync/h3utils"
"github.com/Gleipnir-Technology/nidus-sync/platform"
"github.com/Gleipnir-Technology/nidus-sync/platform/types"
"github.com/aarondl/opt/null"
"github.com/go-chi/render"
//"github.com/gorilla/mux"
"github.com/rs/zerolog/log"
)
@ -91,11 +92,10 @@ type NoteAudioBreadcrumbPayload struct {
type ResponseFieldseeker struct {
MosquitoSources []ResponseMosquitoSource `json:"sources"`
ServiceRequests []ResponseServiceRequest `json:"requests"`
ServiceRequests []types.ServiceRequest `json:"requests"`
TrapData []ResponseTrapData `json:"traps"`
}
// ResponseErr renderer type for handling all sorts of errors.
type ResponseClientIos struct {
Fieldseeker ResponseFieldseeker `json:"fieldseeker"`
Since time.Time `json:"since"`
@ -105,23 +105,6 @@ func (i ResponseClientIos) Render(w http.ResponseWriter, r *http.Request) error
return nil
}
// In the best case scenario, the excellent github.com/pkg/errors package
// helps reveal information on the error, setting it on Err, and in the Render()
// method, using it to set the application-specific error code in AppCode.
type ResponseErr struct {
Error error `json:"-"` // low-level runtime error
HTTPStatusCode int `json:"-"` // http response status code
StatusText string `json:"status"` // user-level status message
AppCode int64 `json:"code,omitempty"` // application-specific error code
ErrorText string `json:"error,omitempty"` // application-level error message, for debugging
}
func (e *ResponseErr) Render(w http.ResponseWriter, r *http.Request) error {
render.Status(r, e.HTTPStatusCode)
return nil
}
type ResponseMosquitoInspection struct {
ActionTaken string `json:"action_taken"`
Comments string `json:"comments"`
@ -252,48 +235,10 @@ func (rtd ResponseNote) Render(w http.ResponseWriter, r *http.Request) error {
return nil
}
type ResponseServiceRequest struct {
Address string `json:"address"`
AssignedTechnician string `json:"assigned_technician"`
City string `json:"city"`
Created string `json:"created"`
H3Cell int64 `json:"h3cell"`
HasDog *bool `json:"has_dog"`
HasSpanishSpeaker *bool `json:"has_spanish_speaker"`
ID string `json:"id"`
Priority string `json:"priority"`
RecordedDate string `json:"recorded_date"`
Source string `json:"source"`
Status string `json:"status"`
Target string `json:"target"`
Zip string `json:"zip"`
}
func (srr ResponseServiceRequest) Render(w http.ResponseWriter, r *http.Request) error {
return nil
}
func NewResponseServiceRequest(sr *models.FieldseekerServicerequest) ResponseServiceRequest {
return ResponseServiceRequest{
Address: sr.Reqaddr1.GetOr(""),
AssignedTechnician: sr.Assignedtech.GetOr(""),
City: sr.Reqcity.GetOr(""),
Created: formatTime(sr.Creationdate),
//H3Cell: sr.H3Cell,
HasDog: toBool(sr.Dog),
HasSpanishSpeaker: toBool(sr.Spanish),
ID: sr.Globalid.String(),
Priority: sr.Priority.GetOr(""),
Status: sr.Status.GetOr(""),
Source: sr.Source.GetOr(""),
Target: sr.Reqtarget.GetOr(""),
Zip: sr.Reqzip.GetOr(""),
}
}
func NewResponseServiceRequests(requests models.FieldseekerServicerequestSlice) []ResponseServiceRequest {
results := make([]ResponseServiceRequest, 0)
func NewResponseServiceRequests(requests models.FieldseekerServicerequestSlice) []types.ServiceRequest {
results := make([]types.ServiceRequest, 0)
for _, i := range requests {
results = append(results, NewResponseServiceRequest(i))
results = append(results, types.ServiceRequestFromModel(i))
}
return results
}

api/upload.go Normal file

@ -0,0 +1 @@
package api


@ -1,18 +1 @@
package api
import (
"context"
"net/http"
nhttp "github.com/Gleipnir-Technology/nidus-sync/http"
"github.com/Gleipnir-Technology/nidus-sync/platform"
)
func getUser(ctx context.Context, r *http.Request, user platform.User, query queryParams) (*platform.User, *nhttp.ErrorWithStatus) {
counts, err := platform.NotificationCountsForUser(ctx, user)
if err != nil {
return nil, nhttp.NewError("get notifications: %w", err)
}
user.NotificationCounts = *counts
return &user, nil
}

@ -1 +1 @@
Subproject commit f5ec5c75c10bf711aa31ff0df56b445fbc2e208e
Subproject commit 63cc8b573739294ea98f7e39d2baec3cd70dfd7f


@ -13,9 +13,9 @@ import (
"golang.org/x/crypto/bcrypt"
)
type NoCredentialsError struct{}
type InactiveUser struct{}
func (e NoCredentialsError) Error() string { return "No credentials were present in the request" }
func (e InactiveUser) Error() string { return "That user is not active" }
type InvalidCredentials struct{}
@ -25,21 +25,59 @@ type InvalidUsername struct{}
func (e InvalidUsername) Error() string { return "That username doesn't exist" }
type NoCredentialsError struct{}
func (e NoCredentialsError) Error() string { return "No credentials were present in the request" }
type AuthenticatedHandler func(http.ResponseWriter, *http.Request, platform.User)
type EnsureAuth struct {
handler AuthenticatedHandler
}
func AddUserSession(r *http.Request, user *platform.User) {
id := strconv.Itoa(int(user.ID))
sessionManager.Put(r.Context(), "user_id", id)
sessionManager.Put(r.Context(), "username", user.Username)
log.Debug().Str("id", id).Str("username", user.Username).Msg("added user session")
func AddUserSession(ctx context.Context, user *platform.User) {
id_str := strconv.Itoa(int(user.ID))
sessionManager.Put(ctx, "user_id", id_str)
sessionManager.Put(ctx, "username", user.Username)
log.Debug().Str("id", id_str).Str("username", user.Username).Msg("added user session")
}
func ImpersonateEnd(ctx context.Context) {
sessionManager.Put(ctx, "impersonated_user_id", "")
}
func ImpersonateUser(ctx context.Context, target_user_id int) {
target_user_id_str := strconv.Itoa(int(target_user_id))
sessionManager.Put(ctx, "impersonated_user_id", target_user_id_str)
}
func ImpersonatedUser(ctx context.Context) *int32 {
i_str := sessionManager.GetString(ctx, "impersonated_user_id")
if i_str == "" {
return nil
}
i, err := strconv.Atoi(i_str)
if err != nil {
log.Error().Err(err).Str("impersonated_user_id", i_str).Msg("failed to parse impersonated_user_id")
return nil
}
result := int32(i)
return &result
}
func ImpersonatorID(ctx context.Context) *int32 {
user_id_str := sessionManager.GetString(ctx, "user_id")
user_id, err := strconv.Atoi(user_id_str)
if err != nil {
log.Error().Err(err).Str("user_id", user_id_str).Msg("failed to parse user_id")
return nil
}
result := int32(user_id)
return &result
}
func GetAuthenticatedUser(r *http.Request) (*platform.User, error) {
ctx := r.Context()
user_id_str := sessionManager.GetString(ctx, "user_id")
impersonated_user_id_str := sessionManager.GetString(ctx, "impersonated_user_id")
if impersonated_user_id_str != "" {
user_id_str = impersonated_user_id_str
}
if user_id_str != "" {
user_id, err := strconv.Atoi(user_id_str)
if err != nil {
@ -47,7 +85,14 @@ func GetAuthenticatedUser(r *http.Request) (*platform.User, error) {
}
username := sessionManager.GetString(ctx, "username")
if user_id > 0 && username != "" {
return platform.UserByID(ctx, int32(user_id))
user, err := platform.UserByID(ctx, int32(user_id))
if err != nil {
return nil, fmt.Errorf("user by ID: %w", err)
}
if !user.IsActive {
return nil, fmt.Errorf("user is inactive")
}
return user, nil
}
}
// If we can't get the user from the session try to get from auth headers
@ -59,7 +104,7 @@ func GetAuthenticatedUser(r *http.Request) (*platform.User, error) {
if err != nil {
return nil, err
}
AddUserSession(r, user)
AddUserSession(ctx, user)
return user, nil
}
@ -69,33 +114,39 @@ func NewEnsureAuth(handlerToWrap AuthenticatedHandler) *EnsureAuth {
func (ea *EnsureAuth) ServeHTTP(w http.ResponseWriter, r *http.Request) {
// If this is an API request respond with a more machine-readable error state
accept := r.Header.Values("Accept")
offers := []string{"application/json", "text/html"}
accept := r.Header.Get("Accept")
/*
offers := []string{"application/json", "text/html"}
content_type := NegotiateContent(accept, offers)
*/
user, err := GetAuthenticatedUser(r)
if err != nil || user == nil {
var msg []byte
// Separate return codes for different authentication failures
if _, ok := err.(*NoCredentialsError); ok {
log.Info().Msg("No credentials present and no session")
w.Header().Set("WWW-Authenticate-Error", "no-credentials")
msg = []byte("Please provide credentials.\n")
} else if _, ok := err.(*platform.NoUserError); ok {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
msg = []byte("Invalid credentials provided.\n")
} else if _, ok := err.(*InvalidCredentials); ok {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
msg = []byte("Invalid credentials provided.\n")
// Don't send authentication headers for browsers because it forces the authentication popup
requested_with := r.Header.Get("X-Requested-With")
//log.Debug().Str("x-requested-with", requested_with).Send()
if !strings.HasPrefix(requested_with, "nidus-web") && accept != "text/event-stream" {
w.Header().Set("WWW-Authenticate", `Basic realm="Nidus Sync"`)
// Separate return codes for different authentication failures
if _, ok := err.(*NoCredentialsError); ok {
log.Info().Msg("No credentials present and no session")
w.Header().Set("WWW-Authenticate-Error", "no-credentials")
msg = []byte("Please provide credentials.\n")
} else if _, ok := err.(*platform.NoUserError); ok {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
msg = []byte("Invalid credentials provided.\n")
} else if _, ok := err.(*InvalidCredentials); ok {
w.Header().Set("WWW-Authenticate-Error", "invalid-credentials")
msg = []byte("Invalid credentials provided.\n")
}
}
if content_type == "text/html" {
http.Redirect(w, r, "/signin?next="+r.URL.Path, http.StatusSeeOther)
return
}
w.Header().Set("WWW-Authenticate", `Basic realm="Nidus Sync"`)
w.WriteHeader(401)
w.Write(msg)
_, err = w.Write(msg)
if err != nil {
log.Error().Err(err).Msg("failed to write response")
}
return
}
ea.handler(w, r, *user)
@ -108,13 +159,17 @@ func SigninUser(r *http.Request, username string, password string) (*platform.Us
if user == nil {
return nil, errors.New("No matching user")
}
AddUserSession(r, user)
AddUserSession(r.Context(), user)
return user, nil
}
func SignoutUser(r *http.Request, user platform.User) {
sessionManager.Put(r.Context(), "user_id", "")
sessionManager.Put(r.Context(), "username", "")
err := sessionManager.Destroy(r.Context())
if err != nil {
log.Error().Err(err).Msg("failed to destroy session for user on signout")
}
log.Info().Str("username", user.Username).Int("user_id", (user.ID)).Msg("Ended user session")
}
@ -168,6 +223,9 @@ func validateUser(ctx context.Context, username string, password string) (*platf
log.Info().Str("username", username).Str("password", redact(password)).Msg("Invalid username")
return nil, InvalidUsername{}
}
if !user.IsActive {
return nil, InactiveUser{}
}
if !validatePassword(password, user.PasswordHash) {
log.Info().Str("username", username).Str("password", redact(password)).Str("hash", passwordHash).Msg("Invalid password for user")
return nil, InvalidCredentials{}


@ -3,9 +3,9 @@ package auth
import (
"time"
"github.com/alexedwards/scs/v2"
"github.com/alexedwards/scs/pgxstore"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/alexedwards/scs/pgxstore"
"github.com/alexedwards/scs/v2"
)
var sessionManager *scs.SessionManager


@ -26,7 +26,7 @@ func main() {
}
func scanValue(message string, result *string) {
fmt.Printf(message)
fmt.Printf("%s", message)
scanner := bufio.NewScanner(os.Stdin)
if ok := scanner.Scan(); !ok {
log.Fatal(errors.New("Failed to scan input"))

cmd/test-jet/main.go Normal file

@ -0,0 +1,53 @@
package main
import (
"context"
"log"
"os"
"github.com/Gleipnir-Technology/nidus-sync/config"
"github.com/Gleipnir-Technology/nidus-sync/db"
"github.com/Gleipnir-Technology/nidus-sync/db/query/public"
)
func main() {
err := config.Parse()
if err != nil {
log.Printf("failed on config: %v", err)
os.Exit(1)
}
ctx := context.TODO()
err = db.InitializeDatabase(ctx, config.PGDSN)
if err != nil {
log.Printf("failed on db: %v", err)
os.Exit(2)
}
txn, err := db.BeginTxn(ctx)
if err != nil {
log.Printf("failed on txn: %v", err)
os.Exit(3)
}
defer txn.Rollback(ctx)
log.Printf("doing address")
gid := "openaddresses:address:us/ca/tulare-addresses-county:0dc28458fd03e3fa"
address, err := public.AddressFromGID(ctx, txn, gid)
if err != nil {
log.Printf("failed on query: %v", err)
os.Exit(4)
}
//log.Printf("address %d lat %f lng %f", address.ID, *address.LocationLatitude, *address.LocationLongitude)
log.Printf("Address id %d location %s", address.ID, address.Location)
txn.Commit(ctx)
/*
log.Printf("doing comm")
id := int64(1)
comm, err := public.CommunicationFromID(ctx, id)
if err != nil {
log.Printf("failed on query: %v", err)
os.Exit(4)
}
log.Printf("communication %d", comm.ID)
*/
}


@ -5,12 +5,13 @@ import (
"encoding/json"
"errors"
"fmt"
"io/ioutil"
"io"
"net/http"
"net/url"
"strconv"
"github.com/Gleipnir-Technology/nidus-sync/config"
"github.com/Gleipnir-Technology/nidus-sync/lint"
"github.com/rs/zerolog/log"
)
@ -87,10 +88,10 @@ func makeVoipMSRequest(params url.Values) (VoipMSResponse, error) {
log.Warn().Err(err).Str("url", full_url).Msg("Failed to make request to Voip.MS")
return result, fmt.Errorf("Error making request: %w", err)
}
defer resp.Body.Close()
defer lint.LogOnErr(resp.Body.Close, "failed closing response body")
// Read the response body
body, err := ioutil.ReadAll(resp.Body)
body, err := io.ReadAll(resp.Body)
if err != nil {
log.Warn().Err(err).Str("url", full_url).Msg("Failed to read Voip.MS response body")
return result, fmt.Errorf("Failed to read response: %w", err)


@ -26,12 +26,14 @@ var (
ForwardEmailNidusAddress string
ForwardEmailNidusPassword string
ForwardEmailNidusUsername string
LobAPIKey string
PGDSN string
PhoneNumberReport phonenumbers.PhoneNumber
PhoneNumberReportStr string
PhoneNumberSupport phonenumbers.PhoneNumber
PhoneNumberSupportStr string
SentryDSN string
SentryDSNFrontend string
StadiaMapsAPIKey string
TextProvider string
TwilioAuthToken string
@ -96,7 +98,7 @@ func Parse() (err error) {
if Environment == "" {
return fmt.Errorf("You must specify a non-empty ENVIRONMENT")
}
if !(Environment == "PRODUCTION" || Environment == "DEVELOPMENT") {
if Environment != "PRODUCTION" && Environment != "DEVELOPMENT" {
return fmt.Errorf("ENVIRONMENT should be either DEVELOPMENT or PRODUCTION")
}
FieldseekerSchemaDirectory = os.Getenv("FIELDSEEKER_SCHEMA_DIRECTORY")
@ -135,6 +137,10 @@ func Parse() (err error) {
if ForwardEmailNidusPassword == "" {
return fmt.Errorf("You must specify a non-empty FORWARDEMAIL_NIDUS_PASSWORD")
}
LobAPIKey = os.Getenv("LOB_API_KEY")
if LobAPIKey == "" {
return fmt.Errorf("You must specify a non-empty LOB_API_KEY")
}
PGDSN = os.Getenv("POSTGRES_DSN")
if PGDSN == "" {
return fmt.Errorf("You must specify a non-empty POSTGRES_DSN")
@ -163,6 +169,10 @@ func Parse() (err error) {
if SentryDSN == "" {
return fmt.Errorf("You must specify a non-empty SENTRY_DSN")
}
SentryDSNFrontend = os.Getenv("SENTRY_DSN_FRONTEND")
if SentryDSNFrontend == "" {
return fmt.Errorf("You must specify a non-empty SENTRY_DSN_FRONTEND")
}
StadiaMapsAPIKey = os.Getenv("STADIA_MAPS_API_KEY")
if StadiaMapsAPIKey == "" {
return fmt.Errorf("You must specify a non-empty STADIA_MAPS_API_KEY")
@ -209,5 +219,5 @@ func Parse() (err error) {
}
func ArcGISOauthRedirectURL() string {
return MakeURLNidus("/arcgis/oauth/callback")
return MakeURLNidus("/oauth/arcgis/callback")
}


@ -18,10 +18,10 @@ aliases:
no_tests: true
psql:
schemas:
- "arcgis"
- "comms"
- "fieldseeker"
- "fileupload"
- "lob"
- "public"
- "publicreport"
- "tile"


@ -7,38 +7,148 @@ import (
"errors"
"fmt"
"io/fs"
"sync"
//"github.com/georgysavva/scany/v2/pgxscan"
//"github.com/jackc/pgx/v5"
"github.com/Gleipnir-Technology/bob"
"github.com/Gleipnir-Technology/jet/postgres"
"github.com/jackc/pgx/v5"
"github.com/jackc/pgx/v5/pgxpool"
"github.com/jackc/pgx/v5/stdlib"
_ "github.com/jackc/pgx/v5/stdlib"
"github.com/pressly/goose/v3"
"github.com/rs/zerolog/log"
"github.com/stephenafamo/scan"
pgxgeom "github.com/twpayne/pgx-geom"
)
var ErrNoRows = pgx.ErrNoRows
//go:embed migrations/*.sql
var embedMigrations embed.FS
type postgres struct {
type pginstance struct {
BobDB bob.DB
PGXPool *pgxpool.Pool
}
var (
PGInstance *postgres
pgOnce sync.Once
PGInstance *pginstance
)
func ExecuteNone(ctx context.Context, stmt postgres.Statement) error {
query, args := stmt.Sql()
// Exec avoids leaking the pgx.Rows handle a discarded Query result would hold open.
_, err := PGInstance.PGXPool.Exec(ctx, query, args...)
return err
}
func ExecuteNoneTx(ctx context.Context, txn Ex, stmt postgres.Statement) error {
query, args := stmt.Sql()
r, err := txn.Query(ctx, query, args...)
if err != nil {
return fmt.Errorf("query: %w", err)
}
r.Close()
// Execution errors may only surface after Close, via Rows.Err.
return r.Err()
}
func ExecuteNoneTxBob(ctx context.Context, txn bob.Tx, stmt postgres.Statement) error {
query, args := stmt.Sql()
r, err := txn.QueryContext(ctx, query, args...)
if err != nil {
return fmt.Errorf("query: %w", err)
}
if err := r.Close(); err != nil {
return fmt.Errorf("close: %w", err)
}
return r.Err()
}
func ExecuteOne[T any](ctx context.Context, stmt postgres.Statement) (T, error) {
query, args := stmt.Sql()
var result T
row, err := PGInstance.PGXPool.Query(ctx, query, args...)
if err != nil {
return result, fmt.Errorf("execute query: %w", err)
}
var collected *T
collected, err = pgx.CollectOneRow(row, pgx.RowToAddrOfStructByPos[T])
if err != nil || collected == nil {
return result, fmt.Errorf("collect row: %w", err)
}
return *collected, nil
}
func ExecuteOneTx[T any](ctx context.Context, txn Ex, stmt postgres.Statement) (T, error) {
query, args := stmt.Sql()
//result, err := scan.One(ctx, txn, scan.StructMapper[T](), query, args...)
row, err := txn.Query(ctx, query, args...)
var result T
if err != nil {
return result, fmt.Errorf("txn query: %w", err)
}
var collected *T
collected, err = pgx.CollectOneRow(row, pgx.RowToAddrOfStructByPos[T])
if err != nil || collected == nil {
return result, fmt.Errorf("collect row: %w", err)
}
return *collected, nil
}
func ExecuteOneTxBob[T any](ctx context.Context, txn bob.Tx, stmt postgres.Statement) (T, error) {
query, args := stmt.Sql()
return scan.One(ctx, txn, scan.StructMapper[T](), query, args...)
}
func ExecuteMany[T any](ctx context.Context, stmt postgres.Statement) ([]T, error) {
query, args := stmt.Sql()
rows, err := PGInstance.PGXPool.Query(ctx, query, args...)
if err != nil {
return nil, fmt.Errorf("execute query: %w", err)
}
collected, err := pgx.CollectRows(rows, pgx.RowToAddrOfStructByPos[T])
if err != nil {
return []T{}, fmt.Errorf("collect rows: %w", err)
}
results := make([]T, len(collected))
for i, c := range collected {
if c == nil {
return results, fmt.Errorf("null collected")
}
results[i] = *c
}
return results, nil
}
func ExecuteManyTx[T any](ctx context.Context, txn Ex, stmt postgres.Statement) ([]T, error) {
query, args := stmt.Sql()
rows, err := txn.Query(ctx, query, args...)
if err != nil {
return nil, fmt.Errorf("execute query: %w", err)
}
collected, err := pgx.CollectRows(rows, pgx.RowToAddrOfStructByPos[T])
if err != nil {
return []T{}, fmt.Errorf("collect rows: %w", err)
}
results := make([]T, len(collected))
for i, c := range collected {
if c == nil {
return results, fmt.Errorf("null collected")
}
results[i] = *c
}
return results, nil
}
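Both `ExecuteMany` and `ExecuteManyTx` end with the same step: `pgx.CollectRows` yields `[]*T`, which is dereferenced into `[]T` with a nil guard. That step can be illustrated standalone (`derefAll` is a hypothetical helper, not in the codebase):

```go
package main

import "fmt"

// derefAll sketches the pointer-dereference loop ExecuteMany performs after
// pgx.CollectRows: convert []*T to []T, failing if any element is nil.
func derefAll[T any](collected []*T) ([]T, error) {
	results := make([]T, len(collected))
	for i, c := range collected {
		if c == nil {
			return nil, fmt.Errorf("null collected")
		}
		results[i] = *c
	}
	return results, nil
}

func main() {
	a, b := 1, 2
	vals, err := derefAll([]*int{&a, &b})
	fmt.Println(vals, err) // [1 2] <nil>
}
```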
func doMigrations(connection_string string) error {
log.Debug().Str("dsn", connection_string).Msg("Connecting to database")
db, err := sql.Open("pgx", connection_string)
if err != nil {
return fmt.Errorf("Failed to open database connection: %w", err)
}
defer db.Close()
defer func() {
err := db.Close()
if err != nil {
log.Error().Err(err).Msg("failed to close database connection")
}
}()
row := db.QueryRowContext(context.Background(), "SELECT version()")
var val string
if err := row.Scan(&val); err != nil {
@@ -95,15 +205,23 @@ func InitializeDatabase(ctx context.Context, uri string) error {
log.Debug().Msg("No database migrations necessary")
}
pgOnce.Do(func() {
db, e := pgxpool.New(ctx, uri)
bobDB := bob.NewDB(stdlib.OpenDBFromPool(db))
PGInstance = &postgres{bobDB, db}
err = e
})
config, err := pgxpool.ParseConfig(uri)
if err != nil {
return fmt.Errorf("unable to create connection pool: %w", err)
return fmt.Errorf("parse config: %w", err)
}
config.AfterConnect = func(ctx2 context.Context, conn *pgx.Conn) error {
err2 := pgxgeom.Register(ctx, conn)
if err2 != nil {
return fmt.Errorf("pgxgeom register: %w", err2)
}
return nil
}
db, err := pgxpool.NewWithConfig(ctx, config)
if err != nil {
return fmt.Errorf("new pool: %w", err)
}
bobDB := bob.NewDB(stdlib.OpenDBFromPool(db))
PGInstance = &pginstance{bobDB, db}
var current string
query := `SELECT current_database()`
@@ -111,10 +229,6 @@ func InitializeDatabase(ctx context.Context, uri string) error {
if err != nil {
return fmt.Errorf("Failed to get database current: %w", err)
}
err = prepareStatements(ctx)
if err != nil {
return fmt.Errorf("Failed to initialize prepared statements: %w", err)
}
return nil
}
@@ -123,7 +237,12 @@ func needsMigrations(connection_string string) (*bool, error) {
if err != nil {
return nil, fmt.Errorf("Failed to open database connection: %w", err)
}
defer db.Close()
defer func() {
err := db.Close()
if err != nil {
log.Error().Err(err).Msg("failed to close database connection")
}
}()
row := db.QueryRowContext(context.Background(), "SELECT version()")
var val string
if err := row.Scan(&val); err != nil {


@@ -10,8 +10,17 @@ var AddressErrors = &addressErrors{
columns: []string{"id"},
s: "address_pkey",
},
ErrUniqueAddressGidUnique: &UniqueConstraintError{
schema: "",
table: "address",
columns: []string{"gid"},
s: "address_gid_unique",
},
}
type addressErrors struct {
ErrUniqueAddressPkey *UniqueConstraintError
ErrUniqueAddressGidUnique *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisAccountErrors = &arcgisAccountErrors{
ErrUniqueAccountPkey: &UniqueConstraintError{
schema: "arcgis",
table: "account",
columns: []string{"id"},
s: "account_pkey",
},
}
type arcgisAccountErrors struct {
ErrUniqueAccountPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisAddressMappingErrors = &arcgisAddressMappingErrors{
ErrUniqueAddressMappingPkey: &UniqueConstraintError{
schema: "arcgis",
table: "address_mapping",
columns: []string{"organization_id", "destination"},
s: "address_mapping_pkey",
},
}
type arcgisAddressMappingErrors struct {
ErrUniqueAddressMappingPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisLayerErrors = &arcgisLayerErrors{
ErrUniqueLayerPkey: &UniqueConstraintError{
schema: "arcgis",
table: "layer",
columns: []string{"feature_service_item_id", "index_"},
s: "layer_pkey",
},
}
type arcgisLayerErrors struct {
ErrUniqueLayerPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisLayerFieldErrors = &arcgisLayerFieldErrors{
ErrUniqueLayerFieldPkey: &UniqueConstraintError{
schema: "arcgis",
table: "layer_field",
columns: []string{"layer_feature_service_item_id", "layer_index", "name"},
s: "layer_field_pkey",
},
}
type arcgisLayerFieldErrors struct {
ErrUniqueLayerFieldPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisOauthTokenErrors = &arcgisOauthTokenErrors{
ErrUniqueOauthTokenPkey: &UniqueConstraintError{
schema: "arcgis",
table: "oauth_token",
columns: []string{"id"},
s: "oauth_token_pkey",
},
}
type arcgisOauthTokenErrors struct {
ErrUniqueOauthTokenPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisParcelMappingErrors = &arcgisParcelMappingErrors{
ErrUniqueParcelMappingPkey: &UniqueConstraintError{
schema: "arcgis",
table: "parcel_mapping",
columns: []string{"organization_id", "destination"},
s: "parcel_mapping_pkey",
},
}
type arcgisParcelMappingErrors struct {
ErrUniqueParcelMappingPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisServiceFeatureErrors = &arcgisServiceFeatureErrors{
ErrUniqueFeatureServicePkey: &UniqueConstraintError{
schema: "arcgis",
table: "service_feature",
columns: []string{"item_id"},
s: "feature_service_pkey",
},
}
type arcgisServiceFeatureErrors struct {
ErrUniqueFeatureServicePkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisServiceMapErrors = &arcgisServiceMapErrors{
ErrUniqueServiceMapPkey: &UniqueConstraintError{
schema: "arcgis",
table: "service_map",
columns: []string{"arcgis_id"},
s: "service_map_pkey",
},
}
type arcgisServiceMapErrors struct {
ErrUniqueServiceMapPkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisUserErrors = &arcgisuserErrors{
ErrUniqueUser_Pkey: &UniqueConstraintError{
schema: "arcgis",
table: "user_",
columns: []string{"id"},
s: "user__pkey",
},
}
type arcgisuserErrors struct {
ErrUniqueUser_Pkey *UniqueConstraintError
}


@@ -1,17 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var ArcgisUserPrivilegeErrors = &arcgisUserPrivilegeErrors{
ErrUniqueUserPrivilegePkey: &UniqueConstraintError{
schema: "arcgis",
table: "user_privilege",
columns: []string{"user_id", "privilege"},
s: "user_privilege_pkey",
},
}
type arcgisUserPrivilegeErrors struct {
ErrUniqueUserPrivilegePkey *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var CommunicationErrors = &communicationErrors{
ErrUniqueCommunicationPkey: &UniqueConstraintError{
schema: "",
table: "communication",
columns: []string{"id"},
s: "communication_pkey",
},
}
type communicationErrors struct {
ErrUniqueCommunicationPkey *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var CommunicationLogEntryErrors = &communicationLogEntryErrors{
ErrUniqueCommunicationLogEntryPkey: &UniqueConstraintError{
schema: "",
table: "communication_log_entry",
columns: []string{"id"},
s: "communication_log_entry_pkey",
},
}
type communicationLogEntryErrors struct {
ErrUniqueCommunicationLogEntryPkey *UniqueConstraintError
}


@@ -4,6 +4,13 @@
package dberrors
var ComplianceReportRequestMailerErrors = &complianceReportRequestMailerErrors{
ErrUniqueComplianceReportRequestMailerPkey: &UniqueConstraintError{
schema: "",
table: "compliance_report_request_mailer",
columns: []string{"id"},
s: "compliance_report_request_mailer_pkey",
},
ErrUniqueComplianceReportRequestMaiComplianceReportRequestId_Key: &UniqueConstraintError{
schema: "",
table: "compliance_report_request_mailer",
@@ -13,5 +20,7 @@ var ComplianceReportRequestMailerErrors = &complianceReportRequestMailerErrors{
}
type complianceReportRequestMailerErrors struct {
ErrUniqueComplianceReportRequestMailerPkey *UniqueConstraintError
ErrUniqueComplianceReportRequestMaiComplianceReportRequestId_Key *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var LobEventErrors = &lobEventErrors{
ErrUniqueEventPkey: &UniqueConstraintError{
schema: "lob",
table: "event",
columns: []string{"id"},
s: "event_pkey",
},
}
type lobEventErrors struct {
ErrUniqueEventPkey *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var LogImpersonationErrors = &logImpersonationErrors{
ErrUniqueLogImpersonationPkey: &UniqueConstraintError{
schema: "",
table: "log_impersonation",
columns: []string{"id"},
s: "log_impersonation_pkey",
},
}
type logImpersonationErrors struct {
ErrUniqueLogImpersonationPkey *UniqueConstraintError
}


@@ -10,8 +10,17 @@ var NoteImageErrors = &noteImageErrors{
columns: []string{"version", "uuid"},
s: "note_image_pkey",
},
ErrUniqueNoteImageIdUnique: &UniqueConstraintError{
schema: "",
table: "note_image",
columns: []string{"id"},
s: "note_image_id_unique",
},
}
type noteImageErrors struct {
ErrUniqueNoteImagePkey *UniqueConstraintError
ErrUniqueNoteImageIdUnique *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var PublicreportClientErrors = &publicreportClientErrors{
ErrUniqueClientPkey: &UniqueConstraintError{
schema: "publicreport",
table: "client",
columns: []string{"uuid"},
s: "client_pkey",
},
}
type publicreportClientErrors struct {
ErrUniqueClientPkey *UniqueConstraintError
}


@@ -0,0 +1,17 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var PublicreportComplianceErrors = &publicreportComplianceErrors{
ErrUniqueCompliancePkey: &UniqueConstraintError{
schema: "publicreport",
table: "compliance",
columns: []string{"report_id"},
s: "compliance_pkey",
},
}
type publicreportComplianceErrors struct {
ErrUniqueCompliancePkey *UniqueConstraintError
}


@@ -7,7 +7,7 @@ var TileCachedImageErrors = &tileCachedImageErrors{
ErrUniqueCachedImagePkey: &UniqueConstraintError{
schema: "tile",
table: "cached_image",
columns: []string{"arcgis_id", "x", "y", "z"},
columns: []string{"service_id", "x", "y", "z"},
s: "cached_image_pkey",
},
}


@@ -0,0 +1,26 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dberrors
var TileServiceErrors = &tileServiceErrors{
ErrUniqueServicePkey: &UniqueConstraintError{
schema: "tile",
table: "service",
columns: []string{"id"},
s: "service_pkey",
},
ErrUniqueServiceNameUnique: &UniqueConstraintError{
schema: "tile",
table: "service",
columns: []string{"name"},
s: "service_name_unique",
},
}
type tileServiceErrors struct {
ErrUniqueServicePkey *UniqueConstraintError
ErrUniqueServiceNameUnique *UniqueConstraintError
}


@@ -17,7 +17,7 @@ var Addresses = Table[
Columns: addressColumns{
Country: column{
Name: "country",
DBType: "public.countrytype",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
@@ -114,6 +114,15 @@
Generated: false,
AutoIncr: false,
},
Gid: column{
Name: "gid",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: addressIndexes{
AddressPkey: index{
@@ -133,6 +142,23 @@
Where: "",
Include: []string{},
},
AddressGidUnique: index{
Type: "btree",
Name: "address_gid_unique",
Columns: []indexColumn{
{
Name: "gid",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
IdxAddressGeom: index{
Type: "gist",
Name: "idx_address_geom",
@@ -157,6 +183,14 @@
Comment: "",
},
Uniques: addressUniques{
AddressGidUnique: constraint{
Name: "address_gid_unique",
Columns: []string{"gid"},
Comment: "",
},
},
Comment: "",
}
@@ -172,22 +206,24 @@ type addressColumns struct {
Unit column
Region column
Number column
Gid column
}
func (c addressColumns) AsSlice() []column {
return []column{
c.Country, c.Created, c.Location, c.H3cell, c.ID, c.Locality, c.PostalCode, c.Street, c.Unit, c.Region, c.Number,
c.Country, c.Created, c.Location, c.H3cell, c.ID, c.Locality, c.PostalCode, c.Street, c.Unit, c.Region, c.Number, c.Gid,
}
}
type addressIndexes struct {
AddressPkey index
IdxAddressGeom index
AddressPkey index
AddressGidUnique index
IdxAddressGeom index
}
func (i addressIndexes) AsSlice() []index {
return []index{
i.AddressPkey, i.IdxAddressGeom,
i.AddressPkey, i.AddressGidUnique, i.IdxAddressGeom,
}
}
@@ -197,10 +233,14 @@ func (f addressForeignKeys) AsSlice() []foreignKey {
return []foreignKey{}
}
type addressUniques struct{}
type addressUniques struct {
AddressGidUnique constraint
}
func (u addressUniques) AsSlice() []constraint {
return []constraint{}
return []constraint{
u.AddressGidUnique,
}
}
type addressChecks struct{}


@@ -1,177 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisAccounts = Table[
arcgisAccountColumns,
arcgisAccountIndexes,
arcgisAccountForeignKeys,
arcgisAccountUniques,
arcgisAccountChecks,
]{
Schema: "arcgis",
Name: "account",
Columns: arcgisAccountColumns{
ID: column{
Name: "id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Name: column{
Name: "name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
OrganizationID: column{
Name: "organization_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
URLFeatures: column{
Name: "url_features",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
URLInsights: column{
Name: "url_insights",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
URLGeometry: column{
Name: "url_geometry",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
URLNotebooks: column{
Name: "url_notebooks",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
URLTiles: column{
Name: "url_tiles",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisAccountIndexes{
AccountPkey: index{
Type: "btree",
Name: "account_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "account_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: arcgisAccountForeignKeys{
ArcgisAccountAccountOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.account.account_organization_id_fkey",
Columns: []string{"organization_id"},
Comment: "",
},
ForeignTable: "organization",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisAccountColumns struct {
ID column
Name column
OrganizationID column
URLFeatures column
URLInsights column
URLGeometry column
URLNotebooks column
URLTiles column
}
func (c arcgisAccountColumns) AsSlice() []column {
return []column{
c.ID, c.Name, c.OrganizationID, c.URLFeatures, c.URLInsights, c.URLGeometry, c.URLNotebooks, c.URLTiles,
}
}
type arcgisAccountIndexes struct {
AccountPkey index
}
func (i arcgisAccountIndexes) AsSlice() []index {
return []index{
i.AccountPkey,
}
}
type arcgisAccountForeignKeys struct {
ArcgisAccountAccountOrganizationIDFkey foreignKey
}
func (f arcgisAccountForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisAccountAccountOrganizationIDFkey,
}
}
type arcgisAccountUniques struct{}
func (u arcgisAccountUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisAccountChecks struct{}
func (c arcgisAccountChecks) AsSlice() []check {
return []check{}
}


@@ -1,162 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisAddressMappings = Table[
arcgisAddressMappingColumns,
arcgisAddressMappingIndexes,
arcgisAddressMappingForeignKeys,
arcgisAddressMappingUniques,
arcgisAddressMappingChecks,
]{
Schema: "arcgis",
Name: "address_mapping",
Columns: arcgisAddressMappingColumns{
Destination: column{
Name: "destination",
DBType: "arcgis.mappingdestinationaddress",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerFeatureServiceItemID: column{
Name: "layer_feature_service_item_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerIndex: column{
Name: "layer_index",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerFieldName: column{
Name: "layer_field_name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
OrganizationID: column{
Name: "organization_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisAddressMappingIndexes{
AddressMappingPkey: index{
Type: "btree",
Name: "address_mapping_pkey",
Columns: []indexColumn{
{
Name: "organization_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "destination",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false, false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "address_mapping_pkey",
Columns: []string{"organization_id", "destination"},
Comment: "",
},
ForeignKeys: arcgisAddressMappingForeignKeys{
ArcgisAddressMappingAddressMappingLayerFeatureServiceItemIDLayerIndexFkey: foreignKey{
constraint: constraint{
Name: "arcgis.address_mapping.address_mapping_layer_feature_service_item_id_layer_index__fkey",
Columns: []string{"layer_feature_service_item_id", "layer_index", "layer_field_name"},
Comment: "",
},
ForeignTable: "arcgis.layer_field",
ForeignColumns: []string{"layer_feature_service_item_id", "layer_index", "name"},
},
ArcgisAddressMappingAddressMappingOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.address_mapping.address_mapping_organization_id_fkey",
Columns: []string{"organization_id"},
Comment: "",
},
ForeignTable: "organization",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisAddressMappingColumns struct {
Destination column
LayerFeatureServiceItemID column
LayerIndex column
LayerFieldName column
OrganizationID column
}
func (c arcgisAddressMappingColumns) AsSlice() []column {
return []column{
c.Destination, c.LayerFeatureServiceItemID, c.LayerIndex, c.LayerFieldName, c.OrganizationID,
}
}
type arcgisAddressMappingIndexes struct {
AddressMappingPkey index
}
func (i arcgisAddressMappingIndexes) AsSlice() []index {
return []index{
i.AddressMappingPkey,
}
}
type arcgisAddressMappingForeignKeys struct {
ArcgisAddressMappingAddressMappingLayerFeatureServiceItemIDLayerIndexFkey foreignKey
ArcgisAddressMappingAddressMappingOrganizationIDFkey foreignKey
}
func (f arcgisAddressMappingForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisAddressMappingAddressMappingLayerFeatureServiceItemIDLayerIndexFkey, f.ArcgisAddressMappingAddressMappingOrganizationIDFkey,
}
}
type arcgisAddressMappingUniques struct{}
func (u arcgisAddressMappingUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisAddressMappingChecks struct{}
func (c arcgisAddressMappingChecks) AsSlice() []check {
return []check{}
}


@@ -1,132 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisLayers = Table[
arcgisLayerColumns,
arcgisLayerIndexes,
arcgisLayerForeignKeys,
arcgisLayerUniques,
arcgisLayerChecks,
]{
Schema: "arcgis",
Name: "layer",
Columns: arcgisLayerColumns{
Extent: column{
Name: "extent",
DBType: "box2d",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
FeatureServiceItemID: column{
Name: "feature_service_item_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Index: column{
Name: "index_",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisLayerIndexes{
LayerPkey: index{
Type: "btree",
Name: "layer_pkey",
Columns: []indexColumn{
{
Name: "feature_service_item_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "index_",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false, false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "layer_pkey",
Columns: []string{"feature_service_item_id", "index_"},
Comment: "",
},
ForeignKeys: arcgisLayerForeignKeys{
ArcgisLayerLayerFeatureServiceItemIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.layer.layer_feature_service_item_id_fkey",
Columns: []string{"feature_service_item_id"},
Comment: "",
},
ForeignTable: "arcgis.service_feature",
ForeignColumns: []string{"item_id"},
},
},
Comment: "",
}
type arcgisLayerColumns struct {
Extent column
FeatureServiceItemID column
Index column
}
func (c arcgisLayerColumns) AsSlice() []column {
return []column{
c.Extent, c.FeatureServiceItemID, c.Index,
}
}
type arcgisLayerIndexes struct {
LayerPkey index
}
func (i arcgisLayerIndexes) AsSlice() []index {
return []index{
i.LayerPkey,
}
}
type arcgisLayerForeignKeys struct {
ArcgisLayerLayerFeatureServiceItemIDFkey foreignKey
}
func (f arcgisLayerForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisLayerLayerFeatureServiceItemIDFkey,
}
}
type arcgisLayerUniques struct{}
func (u arcgisLayerUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisLayerChecks struct{}
func (c arcgisLayerChecks) AsSlice() []check {
return []check{}
}


@@ -1,147 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisLayerFields = Table[
arcgisLayerFieldColumns,
arcgisLayerFieldIndexes,
arcgisLayerFieldForeignKeys,
arcgisLayerFieldUniques,
arcgisLayerFieldChecks,
]{
Schema: "arcgis",
Name: "layer_field",
Columns: arcgisLayerFieldColumns{
LayerFeatureServiceItemID: column{
Name: "layer_feature_service_item_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerIndex: column{
Name: "layer_index",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Name: column{
Name: "name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Type: column{
Name: "type_",
DBType: "arcgis.fieldtype",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisLayerFieldIndexes{
LayerFieldPkey: index{
Type: "btree",
Name: "layer_field_pkey",
Columns: []indexColumn{
{
Name: "layer_feature_service_item_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "layer_index",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "name",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false, false, false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "layer_field_pkey",
Columns: []string{"layer_feature_service_item_id", "layer_index", "name"},
Comment: "",
},
ForeignKeys: arcgisLayerFieldForeignKeys{
ArcgisLayerFieldLayerFieldLayerFeatureServiceItemIDLayerIndexFkey: foreignKey{
constraint: constraint{
Name: "arcgis.layer_field.layer_field_layer_feature_service_item_id_layer_index_fkey",
Columns: []string{"layer_feature_service_item_id", "layer_index"},
Comment: "",
},
ForeignTable: "arcgis.layer",
ForeignColumns: []string{"feature_service_item_id", "index_"},
},
},
Comment: "",
}
type arcgisLayerFieldColumns struct {
LayerFeatureServiceItemID column
LayerIndex column
Name column
Type column
}
func (c arcgisLayerFieldColumns) AsSlice() []column {
return []column{
c.LayerFeatureServiceItemID, c.LayerIndex, c.Name, c.Type,
}
}
type arcgisLayerFieldIndexes struct {
LayerFieldPkey index
}
func (i arcgisLayerFieldIndexes) AsSlice() []index {
return []index{
i.LayerFieldPkey,
}
}
type arcgisLayerFieldForeignKeys struct {
ArcgisLayerFieldLayerFieldLayerFeatureServiceItemIDLayerIndexFkey foreignKey
}
func (f arcgisLayerFieldForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisLayerFieldLayerFieldLayerFeatureServiceItemIDLayerIndexFkey,
}
}
type arcgisLayerFieldUniques struct{}
func (u arcgisLayerFieldUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisLayerFieldChecks struct{}
func (c arcgisLayerFieldChecks) AsSlice() []check {
return []check{}
}


@@ -1,227 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisOauthTokens = Table[
arcgisOauthTokenColumns,
arcgisOauthTokenIndexes,
arcgisOauthTokenForeignKeys,
arcgisOauthTokenUniques,
arcgisOauthTokenChecks,
]{
Schema: "arcgis",
Name: "oauth_token",
Columns: arcgisOauthTokenColumns{
AccessToken: column{
Name: "access_token",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AccessTokenExpires: column{
Name: "access_token_expires",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ArcgisAccountID: column{
Name: "arcgis_account_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
ArcgisID: column{
Name: "arcgis_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
ArcgisLicenseTypeID: column{
Name: "arcgis_license_type_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('arcgis.oauth_token_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
InvalidatedAt: column{
Name: "invalidated_at",
DBType: "timestamp without time zone",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
RefreshToken: column{
Name: "refresh_token",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
RefreshTokenExpires: column{
Name: "refresh_token_expires",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
UserID: column{
Name: "user_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Username: column{
Name: "username",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisOauthTokenIndexes{
OauthTokenPkey: index{
Type: "btree",
Name: "oauth_token_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "oauth_token_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: arcgisOauthTokenForeignKeys{
ArcgisOauthTokenOauthTokenArcgisAccountIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.oauth_token.oauth_token_arcgis_account_id_fkey",
Columns: []string{"arcgis_account_id"},
Comment: "",
},
ForeignTable: "arcgis.account",
ForeignColumns: []string{"id"},
},
ArcgisOauthTokenOauthTokenUserIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.oauth_token.oauth_token_user_id_fkey",
Columns: []string{"user_id"},
Comment: "",
},
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisOauthTokenColumns struct {
AccessToken column
AccessTokenExpires column
ArcgisAccountID column
ArcgisID column
ArcgisLicenseTypeID column
Created column
ID column
InvalidatedAt column
RefreshToken column
RefreshTokenExpires column
UserID column
Username column
}
func (c arcgisOauthTokenColumns) AsSlice() []column {
return []column{
c.AccessToken, c.AccessTokenExpires, c.ArcgisAccountID, c.ArcgisID, c.ArcgisLicenseTypeID, c.Created, c.ID, c.InvalidatedAt, c.RefreshToken, c.RefreshTokenExpires, c.UserID, c.Username,
}
}
type arcgisOauthTokenIndexes struct {
OauthTokenPkey index
}
func (i arcgisOauthTokenIndexes) AsSlice() []index {
return []index{
i.OauthTokenPkey,
}
}
type arcgisOauthTokenForeignKeys struct {
ArcgisOauthTokenOauthTokenArcgisAccountIDFkey foreignKey
ArcgisOauthTokenOauthTokenUserIDFkey foreignKey
}
func (f arcgisOauthTokenForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisOauthTokenOauthTokenArcgisAccountIDFkey, f.ArcgisOauthTokenOauthTokenUserIDFkey,
}
}
type arcgisOauthTokenUniques struct{}
func (u arcgisOauthTokenUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisOauthTokenChecks struct{}
func (c arcgisOauthTokenChecks) AsSlice() []check {
return []check{}
}
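The generated pattern above pairs a typed Columns struct with an AsSlice helper. As a minimal sketch of how that helper enables generic iteration (using hypothetical, simplified stand-ins for the generated dbinfo types, not the real BobGen output), one might collect the NOT NULL column names for an INSERT list:

```go
package main

import "fmt"

// Hypothetical, simplified stand-ins for the generated dbinfo types
// (the real BobGen output carries more fields per column).
type column struct {
	Name     string
	DBType   string
	Nullable bool
}

type tokenColumns struct {
	AccessToken column
	UserID      column
}

// AsSlice mirrors the generated helper: flatten the typed struct
// into a slice so callers can iterate generically.
func (c tokenColumns) AsSlice() []column {
	return []column{c.AccessToken, c.UserID}
}

// requiredNames collects the NOT NULL column names, e.g. to build
// an INSERT column list.
func requiredNames(cols []column) []string {
	var out []string
	for _, c := range cols {
		if !c.Nullable {
			out = append(out, c.Name)
		}
	}
	return out
}

func main() {
	cols := tokenColumns{
		AccessToken: column{Name: "access_token", DBType: "text"},
		UserID:      column{Name: "user_id", DBType: "integer"},
	}
	fmt.Println(requiredNames(cols.AsSlice())) // [access_token user_id]
}
```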


@@ -1,162 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisParcelMappings = Table[
arcgisParcelMappingColumns,
arcgisParcelMappingIndexes,
arcgisParcelMappingForeignKeys,
arcgisParcelMappingUniques,
arcgisParcelMappingChecks,
]{
Schema: "arcgis",
Name: "parcel_mapping",
Columns: arcgisParcelMappingColumns{
Destination: column{
Name: "destination",
DBType: "arcgis.mappingdestinationparcel",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerFeatureServiceItemID: column{
Name: "layer_feature_service_item_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerIndex: column{
Name: "layer_index",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
LayerFieldName: column{
Name: "layer_field_name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
OrganizationID: column{
Name: "organization_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisParcelMappingIndexes{
ParcelMappingPkey: index{
Type: "btree",
Name: "parcel_mapping_pkey",
Columns: []indexColumn{
{
Name: "organization_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "destination",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false, false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "parcel_mapping_pkey",
Columns: []string{"organization_id", "destination"},
Comment: "",
},
ForeignKeys: arcgisParcelMappingForeignKeys{
ArcgisParcelMappingParcelMappingLayerFeatureServiceItemIDLayerIndexLFkey: foreignKey{
constraint: constraint{
Name: "arcgis.parcel_mapping.parcel_mapping_layer_feature_service_item_id_layer_index_l_fkey",
Columns: []string{"layer_feature_service_item_id", "layer_index", "layer_field_name"},
Comment: "",
},
ForeignTable: "arcgis.layer_field",
ForeignColumns: []string{"layer_feature_service_item_id", "layer_index", "name"},
},
ArcgisParcelMappingParcelMappingOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.parcel_mapping.parcel_mapping_organization_id_fkey",
Columns: []string{"organization_id"},
Comment: "",
},
ForeignTable: "organization",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisParcelMappingColumns struct {
Destination column
LayerFeatureServiceItemID column
LayerIndex column
LayerFieldName column
OrganizationID column
}
func (c arcgisParcelMappingColumns) AsSlice() []column {
return []column{
c.Destination, c.LayerFeatureServiceItemID, c.LayerIndex, c.LayerFieldName, c.OrganizationID,
}
}
type arcgisParcelMappingIndexes struct {
ParcelMappingPkey index
}
func (i arcgisParcelMappingIndexes) AsSlice() []index {
return []index{
i.ParcelMappingPkey,
}
}
type arcgisParcelMappingForeignKeys struct {
ArcgisParcelMappingParcelMappingLayerFeatureServiceItemIDLayerIndexLFkey foreignKey
ArcgisParcelMappingParcelMappingOrganizationIDFkey foreignKey
}
func (f arcgisParcelMappingForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisParcelMappingParcelMappingLayerFeatureServiceItemIDLayerIndexLFkey, f.ArcgisParcelMappingParcelMappingOrganizationIDFkey,
}
}
type arcgisParcelMappingUniques struct{}
func (u arcgisParcelMappingUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisParcelMappingChecks struct{}
func (c arcgisParcelMappingChecks) AsSlice() []check {
return []check{}
}


@@ -1,147 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisServiceFeatures = Table[
arcgisServiceFeatureColumns,
arcgisServiceFeatureIndexes,
arcgisServiceFeatureForeignKeys,
arcgisServiceFeatureUniques,
arcgisServiceFeatureChecks,
]{
Schema: "arcgis",
Name: "service_feature",
Columns: arcgisServiceFeatureColumns{
Extent: column{
Name: "extent",
DBType: "box2d",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ItemID: column{
Name: "item_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
SpatialReference: column{
Name: "spatial_reference",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
URL: column{
Name: "url",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AccountID: column{
Name: "account_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisServiceFeatureIndexes{
FeatureServicePkey: index{
Type: "btree",
Name: "feature_service_pkey",
Columns: []indexColumn{
{
Name: "item_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "feature_service_pkey",
Columns: []string{"item_id"},
Comment: "",
},
ForeignKeys: arcgisServiceFeatureForeignKeys{
ArcgisServiceFeatureServiceFeatureAccountIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.service_feature.service_feature_account_id_fkey",
Columns: []string{"account_id"},
Comment: "",
},
ForeignTable: "arcgis.account",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisServiceFeatureColumns struct {
Extent column
ItemID column
SpatialReference column
URL column
AccountID column
}
func (c arcgisServiceFeatureColumns) AsSlice() []column {
return []column{
c.Extent, c.ItemID, c.SpatialReference, c.URL, c.AccountID,
}
}
type arcgisServiceFeatureIndexes struct {
FeatureServicePkey index
}
func (i arcgisServiceFeatureIndexes) AsSlice() []index {
return []index{
i.FeatureServicePkey,
}
}
type arcgisServiceFeatureForeignKeys struct {
ArcgisServiceFeatureServiceFeatureAccountIDFkey foreignKey
}
func (f arcgisServiceFeatureForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisServiceFeatureServiceFeatureAccountIDFkey,
}
}
type arcgisServiceFeatureUniques struct{}
func (u arcgisServiceFeatureUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisServiceFeatureChecks struct{}
func (c arcgisServiceFeatureChecks) AsSlice() []check {
return []check{}
}


@@ -1,147 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisServiceMaps = Table[
arcgisServiceMapColumns,
arcgisServiceMapIndexes,
arcgisServiceMapForeignKeys,
arcgisServiceMapUniques,
arcgisServiceMapChecks,
]{
Schema: "arcgis",
Name: "service_map",
Columns: arcgisServiceMapColumns{
AccountID: column{
Name: "account_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ArcgisID: column{
Name: "arcgis_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Name: column{
Name: "name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Title: column{
Name: "title",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
URL: column{
Name: "url",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisServiceMapIndexes{
ServiceMapPkey: index{
Type: "btree",
Name: "service_map_pkey",
Columns: []indexColumn{
{
Name: "arcgis_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "service_map_pkey",
Columns: []string{"arcgis_id"},
Comment: "",
},
ForeignKeys: arcgisServiceMapForeignKeys{
ArcgisServiceMapServiceMapAccountIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.service_map.service_map_account_id_fkey",
Columns: []string{"account_id"},
Comment: "",
},
ForeignTable: "arcgis.account",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisServiceMapColumns struct {
AccountID column
ArcgisID column
Name column
Title column
URL column
}
func (c arcgisServiceMapColumns) AsSlice() []column {
return []column{
c.AccountID, c.ArcgisID, c.Name, c.Title, c.URL,
}
}
type arcgisServiceMapIndexes struct {
ServiceMapPkey index
}
func (i arcgisServiceMapIndexes) AsSlice() []index {
return []index{
i.ServiceMapPkey,
}
}
type arcgisServiceMapForeignKeys struct {
ArcgisServiceMapServiceMapAccountIDFkey foreignKey
}
func (f arcgisServiceMapForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisServiceMapServiceMapAccountIDFkey,
}
}
type arcgisServiceMapUniques struct{}
func (u arcgisServiceMapUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisServiceMapChecks struct{}
func (c arcgisServiceMapChecks) AsSlice() []check {
return []check{}
}


@@ -1,237 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisUsers = Table[
arcgisuserColumns,
arcgisuserIndexes,
arcgisuserForeignKeys,
arcgisuserUniques,
arcgisuserChecks,
]{
Schema: "arcgis",
Name: "user_",
Columns: arcgisuserColumns{
Access: column{
Name: "access",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Email: column{
Name: "email",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
FullName: column{
Name: "full_name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Level: column{
Name: "level",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
OrgID: column{
Name: "org_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
PublicUserID: column{
Name: "public_user_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Region: column{
Name: "region",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Role: column{
Name: "role",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
RoleID: column{
Name: "role_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Username: column{
Name: "username",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
UserLicenseTypeID: column{
Name: "user_license_type_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
UserType: column{
Name: "user_type",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisuserIndexes{
UserPkey: index{
Type: "btree",
Name: "user__pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "user__pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: arcgisuserForeignKeys{
ArcgisUserUserPublicUserIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.user_.user__public_user_id_fkey",
Columns: []string{"public_user_id"},
Comment: "",
},
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisuserColumns struct {
Access column
Created column
Email column
FullName column
ID column
Level column
OrgID column
PublicUserID column
Region column
Role column
RoleID column
Username column
UserLicenseTypeID column
UserType column
}
func (c arcgisuserColumns) AsSlice() []column {
return []column{
c.Access, c.Created, c.Email, c.FullName, c.ID, c.Level, c.OrgID, c.PublicUserID, c.Region, c.Role, c.RoleID, c.Username, c.UserLicenseTypeID, c.UserType,
}
}
type arcgisuserIndexes struct {
UserPkey index
}
func (i arcgisuserIndexes) AsSlice() []index {
return []index{
i.UserPkey,
}
}
type arcgisuserForeignKeys struct {
ArcgisUserUserPublicUserIDFkey foreignKey
}
func (f arcgisuserForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisUserUserPublicUserIDFkey,
}
}
type arcgisuserUniques struct{}
func (u arcgisuserUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisuserChecks struct{}
func (c arcgisuserChecks) AsSlice() []check {
return []check{}
}


@@ -1,122 +0,0 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var ArcgisUserPrivileges = Table[
arcgisUserPrivilegeColumns,
arcgisUserPrivilegeIndexes,
arcgisUserPrivilegeForeignKeys,
arcgisUserPrivilegeUniques,
arcgisUserPrivilegeChecks,
]{
Schema: "arcgis",
Name: "user_privilege",
Columns: arcgisUserPrivilegeColumns{
UserID: column{
Name: "user_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Privilege: column{
Name: "privilege",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: arcgisUserPrivilegeIndexes{
UserPrivilegePkey: index{
Type: "btree",
Name: "user_privilege_pkey",
Columns: []indexColumn{
{
Name: "user_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
{
Name: "privilege",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false, false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "user_privilege_pkey",
Columns: []string{"user_id", "privilege"},
Comment: "",
},
ForeignKeys: arcgisUserPrivilegeForeignKeys{
ArcgisUserPrivilegeUserPrivilegeUserIDFkey: foreignKey{
constraint: constraint{
Name: "arcgis.user_privilege.user_privilege_user_id_fkey",
Columns: []string{"user_id"},
Comment: "",
},
ForeignTable: "arcgis.user_",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type arcgisUserPrivilegeColumns struct {
UserID column
Privilege column
}
func (c arcgisUserPrivilegeColumns) AsSlice() []column {
return []column{
c.UserID, c.Privilege,
}
}
type arcgisUserPrivilegeIndexes struct {
UserPrivilegePkey index
}
func (i arcgisUserPrivilegeIndexes) AsSlice() []index {
return []index{
i.UserPrivilegePkey,
}
}
type arcgisUserPrivilegeForeignKeys struct {
ArcgisUserPrivilegeUserPrivilegeUserIDFkey foreignKey
}
func (f arcgisUserPrivilegeForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.ArcgisUserPrivilegeUserPrivilegeUserIDFkey,
}
}
type arcgisUserPrivilegeUniques struct{}
func (u arcgisUserPrivilegeUniques) AsSlice() []constraint {
return []constraint{}
}
type arcgisUserPrivilegeChecks struct{}
func (c arcgisUserPrivilegeChecks) AsSlice() []check {
return []check{}
}


@@ -60,6 +60,15 @@ var CommsMailers = Table[
Generated: false,
AutoIncr: false,
},
ExternalID: column{
Name: "external_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: commsMailerIndexes{
MailerPkey: index{
@@ -101,16 +110,17 @@
}
type commsMailerColumns struct {
-AddressID column
-Created column
-ID column
-Recipient column
-UUID column
+AddressID column
+Created column
+ID column
+Recipient column
+UUID column
+ExternalID column
}
func (c commsMailerColumns) AsSlice() []column {
return []column{
-c.AddressID, c.Created, c.ID, c.Recipient, c.UUID,
+c.AddressID, c.Created, c.ID, c.Recipient, c.UUID, c.ExternalID,
}
}


@@ -42,6 +42,15 @@ var CommsPhones = Table[
Generated: false,
AutoIncr: false,
},
CanSMS: column{
Name: "can_sms",
DBType: "boolean",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: commsPhoneIndexes{
PhonePkey: index{
@@ -75,11 +84,12 @@ type commsPhoneColumns struct {
E164 column
IsSubscribed column
Status column
CanSMS column
}
func (c commsPhoneColumns) AsSlice() []column {
return []column{
-c.E164, c.IsSubscribed, c.Status,
+c.E164, c.IsSubscribed, c.Status, c.CanSMS,
}
}


@@ -0,0 +1,237 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var Communications = Table[
communicationColumns,
communicationIndexes,
communicationForeignKeys,
communicationUniques,
communicationChecks,
]{
Schema: "",
Name: "communication",
Columns: communicationColumns{
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('communication_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
OrganizationID: column{
Name: "organization_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ResponseEmailLogID: column{
Name: "response_email_log_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
ResponseTextLogID: column{
Name: "response_text_log_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
SourceEmailLogID: column{
Name: "source_email_log_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
SourceReportID: column{
Name: "source_report_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
SourceTextLogID: column{
Name: "source_text_log_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
Status: column{
Name: "status",
DBType: "public.communicationstatus",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: communicationIndexes{
CommunicationPkey: index{
Type: "btree",
Name: "communication_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "communication_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: communicationForeignKeys{
CommunicationCommunicationOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_organization_id_fkey",
Columns: []string{"organization_id"},
Comment: "",
},
ForeignTable: "organization",
ForeignColumns: []string{"id"},
},
CommunicationCommunicationResponseEmailLogIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_response_email_log_id_fkey",
Columns: []string{"response_email_log_id"},
Comment: "",
},
ForeignTable: "comms.email_log",
ForeignColumns: []string{"id"},
},
CommunicationCommunicationResponseTextLogIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_response_text_log_id_fkey",
Columns: []string{"response_text_log_id"},
Comment: "",
},
ForeignTable: "comms.text_log",
ForeignColumns: []string{"id"},
},
CommunicationCommunicationSourceEmailLogIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_source_email_log_id_fkey",
Columns: []string{"source_email_log_id"},
Comment: "",
},
ForeignTable: "comms.email_log",
ForeignColumns: []string{"id"},
},
CommunicationCommunicationSourceReportIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_source_report_id_fkey",
Columns: []string{"source_report_id"},
Comment: "",
},
ForeignTable: "publicreport.report",
ForeignColumns: []string{"id"},
},
CommunicationCommunicationSourceTextLogIDFkey: foreignKey{
constraint: constraint{
Name: "communication.communication_source_text_log_id_fkey",
Columns: []string{"source_text_log_id"},
Comment: "",
},
ForeignTable: "comms.text_log",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type communicationColumns struct {
Created column
ID column
OrganizationID column
ResponseEmailLogID column
ResponseTextLogID column
SourceEmailLogID column
SourceReportID column
SourceTextLogID column
Status column
}
func (c communicationColumns) AsSlice() []column {
return []column{
c.Created, c.ID, c.OrganizationID, c.ResponseEmailLogID, c.ResponseTextLogID, c.SourceEmailLogID, c.SourceReportID, c.SourceTextLogID, c.Status,
}
}
type communicationIndexes struct {
CommunicationPkey index
}
func (i communicationIndexes) AsSlice() []index {
return []index{
i.CommunicationPkey,
}
}
type communicationForeignKeys struct {
CommunicationCommunicationOrganizationIDFkey foreignKey
CommunicationCommunicationResponseEmailLogIDFkey foreignKey
CommunicationCommunicationResponseTextLogIDFkey foreignKey
CommunicationCommunicationSourceEmailLogIDFkey foreignKey
CommunicationCommunicationSourceReportIDFkey foreignKey
CommunicationCommunicationSourceTextLogIDFkey foreignKey
}
func (f communicationForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.CommunicationCommunicationOrganizationIDFkey, f.CommunicationCommunicationResponseEmailLogIDFkey, f.CommunicationCommunicationResponseTextLogIDFkey, f.CommunicationCommunicationSourceEmailLogIDFkey, f.CommunicationCommunicationSourceReportIDFkey, f.CommunicationCommunicationSourceTextLogIDFkey,
}
}
type communicationUniques struct{}
func (u communicationUniques) AsSlice() []constraint {
return []constraint{}
}
type communicationChecks struct{}
func (c communicationChecks) AsSlice() []check {
return []check{}
}
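The generated communication table above models its foreign keys as a typed struct with an AsSlice helper. As a hedged sketch (again using hypothetical, simplified stand-ins for the generated constraint and foreignKey types, with an invented fkTargets helper), that slice form makes it easy to index foreign keys by local column, e.g. to answer which table a *_id column points at:

```go
package main

import "fmt"

// Hypothetical, simplified stand-ins for the generated dbinfo types.
type constraint struct {
	Name    string
	Columns []string
}

type foreignKey struct {
	constraint
	ForeignTable   string
	ForeignColumns []string
}

// fkTargets indexes foreign keys by their first local column, so a
// caller can look up the referenced table for a given *_id column.
func fkTargets(fks []foreignKey) map[string]string {
	out := make(map[string]string, len(fks))
	for _, fk := range fks {
		if len(fk.Columns) > 0 {
			out[fk.Columns[0]] = fk.ForeignTable
		}
	}
	return out
}

func main() {
	fks := []foreignKey{
		{constraint{Name: "communication_organization_id_fkey", Columns: []string{"organization_id"}}, "organization", []string{"id"}},
		{constraint{Name: "communication_source_report_id_fkey", Columns: []string{"source_report_id"}}, "publicreport.report", []string{"id"}},
	}
	fmt.Println(fkTargets(fks)["source_report_id"]) // publicreport.report
}
```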


@@ -0,0 +1,157 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var CommunicationLogEntries = Table[
communicationLogEntryColumns,
communicationLogEntryIndexes,
communicationLogEntryForeignKeys,
communicationLogEntryUniques,
communicationLogEntryChecks,
]{
Schema: "",
Name: "communication_log_entry",
Columns: communicationLogEntryColumns{
CommunicationID: column{
Name: "communication_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('communication_log_entry_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Type: column{
Name: "type_",
DBType: "public.communicationlogentry",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
User: column{
Name: "user_",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: communicationLogEntryIndexes{
CommunicationLogEntryPkey: index{
Type: "btree",
Name: "communication_log_entry_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "communication_log_entry_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: communicationLogEntryForeignKeys{
CommunicationLogEntryCommunicationLogEntryCommunicationIDFkey: foreignKey{
constraint: constraint{
Name: "communication_log_entry.communication_log_entry_communication_id_fkey",
Columns: []string{"communication_id"},
Comment: "",
},
ForeignTable: "communication",
ForeignColumns: []string{"id"},
},
CommunicationLogEntryCommunicationLogEntryUserFkey: foreignKey{
constraint: constraint{
Name: "communication_log_entry.communication_log_entry_user__fkey",
Columns: []string{"user_"},
Comment: "",
},
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type communicationLogEntryColumns struct {
CommunicationID column
Created column
ID column
Type column
User column
}
func (c communicationLogEntryColumns) AsSlice() []column {
return []column{
c.CommunicationID, c.Created, c.ID, c.Type, c.User,
}
}
type communicationLogEntryIndexes struct {
CommunicationLogEntryPkey index
}
func (i communicationLogEntryIndexes) AsSlice() []index {
return []index{
i.CommunicationLogEntryPkey,
}
}
type communicationLogEntryForeignKeys struct {
CommunicationLogEntryCommunicationLogEntryCommunicationIDFkey foreignKey
CommunicationLogEntryCommunicationLogEntryUserFkey foreignKey
}
func (f communicationLogEntryForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.CommunicationLogEntryCommunicationLogEntryCommunicationIDFkey, f.CommunicationLogEntryCommunicationLogEntryUserFkey,
}
}
type communicationLogEntryUniques struct{}
func (u communicationLogEntryUniques) AsSlice() []constraint {
return []constraint{}
}
type communicationLogEntryChecks struct{}
func (c communicationLogEntryChecks) AsSlice() []check {
return []check{}
}


@@ -33,8 +33,34 @@ var ComplianceReportRequestMailers = Table[
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('compliance_report_request_mailer_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: complianceReportRequestMailerIndexes{
ComplianceReportRequestMailerPkey: index{
Type: "btree",
Name: "compliance_report_request_mailer_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
ComplianceReportRequestMaiComplianceReportRequestIDKey: index{
Type: "btree",
Name: "compliance_report_request_mai_compliance_report_request_id__key",
@@ -58,7 +84,11 @@ var ComplianceReportRequestMailers = Table[
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "compliance_report_request_mailer_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: complianceReportRequestMailerForeignKeys{
ComplianceReportRequestMailerComplianceReportRequestMaiComplianceReportRequestIDFkey: foreignKey{
constraint: constraint{
@@ -93,21 +123,23 @@
type complianceReportRequestMailerColumns struct {
ComplianceReportRequestID column
MailerID column
ID column
}
func (c complianceReportRequestMailerColumns) AsSlice() []column {
return []column{
-c.ComplianceReportRequestID, c.MailerID,
+c.ComplianceReportRequestID, c.MailerID, c.ID,
}
}
type complianceReportRequestMailerIndexes struct {
ComplianceReportRequestMailerPkey index
ComplianceReportRequestMaiComplianceReportRequestIDKey index
}
func (i complianceReportRequestMailerIndexes) AsSlice() []index {
return []index{
-i.ComplianceReportRequestMaiComplianceReportRequestIDKey,
+i.ComplianceReportRequestMailerPkey, i.ComplianceReportRequestMaiComplianceReportRequestIDKey,
}
}


@@ -114,6 +114,15 @@ var FileuploadFiles = Table[
Generated: false,
AutoIncr: false,
},
Error: column{
Name: "error",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: fileuploadFileIndexes{
FilePkey: index{
@@ -184,11 +193,12 @@ type fileuploadFileColumns struct {
SizeBytes column
FileUUID column
Committer column
Error column
}
func (c fileuploadFileColumns) AsSlice() []column {
return []column{
-c.ID, c.ContentType, c.Created, c.CreatorID, c.Deleted, c.Name, c.OrganizationID, c.Status, c.SizeBytes, c.FileUUID, c.Committer,
+c.ID, c.ContentType, c.Created, c.CreatorID, c.Deleted, c.Name, c.OrganizationID, c.Status, c.SizeBytes, c.FileUUID, c.Committer, c.Error,
}
}


@@ -222,6 +222,15 @@ var FileuploadPools = Table[
Generated: false,
AutoIncr: false,
},
AddressID: column{
Name: "address_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: fileuploadPoolIndexes{
PoolPkey: index{
@@ -248,6 +257,15 @@ var FileuploadPools = Table[
Comment: "",
},
ForeignKeys: fileuploadPoolForeignKeys{
FileuploadPoolPoolAddressIDFkey: foreignKey{
constraint: constraint{
Name: "fileupload.pool.pool_address_id_fkey",
Columns: []string{"address_id"},
Comment: "",
},
ForeignTable: "address",
ForeignColumns: []string{"id"},
},
FileuploadPoolPoolCreatorIDFkey: foreignKey{
constraint: constraint{
Name: "fileupload.pool.pool_creator_id_fkey",
@@ -313,11 +331,12 @@ type fileuploadPoolColumns struct {
AddressLocality column
AddressRegion column
Condition column
AddressID column
}
func (c fileuploadPoolColumns) AsSlice() []column {
return []column{
-c.AddressPostalCode, c.AddressStreet, c.Committed, c.Created, c.CreatorID, c.CSVFile, c.Deleted, c.Geom, c.H3cell, c.ID, c.IsInDistrict, c.IsNew, c.Notes, c.PropertyOwnerName, c.PropertyOwnerPhoneE164, c.ResidentOwned, c.ResidentPhoneE164, c.LineNumber, c.Tags, c.AddressNumber, c.AddressLocality, c.AddressRegion, c.Condition,
+c.AddressPostalCode, c.AddressStreet, c.Committed, c.Created, c.CreatorID, c.CSVFile, c.Deleted, c.Geom, c.H3cell, c.ID, c.IsInDistrict, c.IsNew, c.Notes, c.PropertyOwnerName, c.PropertyOwnerPhoneE164, c.ResidentOwned, c.ResidentPhoneE164, c.LineNumber, c.Tags, c.AddressNumber, c.AddressLocality, c.AddressRegion, c.Condition, c.AddressID,
}
}
@@ -332,6 +351,7 @@ func (i fileuploadPoolIndexes) AsSlice() []index {
}
type fileuploadPoolForeignKeys struct {
FileuploadPoolPoolAddressIDFkey foreignKey
FileuploadPoolPoolCreatorIDFkey foreignKey
FileuploadPoolPoolCSVFileFkey foreignKey
FileuploadPoolPoolPropertyOwnerPhoneE164Fkey foreignKey
@@ -340,7 +360,7 @@ type fileuploadPoolForeignKeys struct {
func (f fileuploadPoolForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.FileuploadPoolPoolCreatorIDFkey, f.FileuploadPoolPoolCSVFileFkey, f.FileuploadPoolPoolPropertyOwnerPhoneE164Fkey, f.FileuploadPoolPoolResidentPhoneE164Fkey,
f.FileuploadPoolPoolAddressIDFkey, f.FileuploadPoolPoolCreatorIDFkey, f.FileuploadPoolPoolCSVFileFkey, f.FileuploadPoolPoolPropertyOwnerPhoneE164Fkey, f.FileuploadPoolPoolResidentPhoneE164Fkey,
}
}

db/dbinfo/lob.event.bob.go (new file, 122 lines)

@@ -0,0 +1,122 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var LobEvents = Table[
lobEventColumns,
lobEventIndexes,
lobEventForeignKeys,
lobEventUniques,
lobEventChecks,
]{
Schema: "lob",
Name: "event",
Columns: lobEventColumns{
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Body: column{
Name: "body",
DBType: "jsonb",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Type: column{
Name: "type_",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: lobEventIndexes{
EventPkey: index{
Type: "btree",
Name: "event_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "event_pkey",
Columns: []string{"id"},
Comment: "",
},
Comment: "",
}
type lobEventColumns struct {
Created column
Body column
ID column
Type column
}
func (c lobEventColumns) AsSlice() []column {
return []column{
c.Created, c.Body, c.ID, c.Type,
}
}
type lobEventIndexes struct {
EventPkey index
}
func (i lobEventIndexes) AsSlice() []index {
return []index{
i.EventPkey,
}
}
type lobEventForeignKeys struct{}
func (f lobEventForeignKeys) AsSlice() []foreignKey {
return []foreignKey{}
}
type lobEventUniques struct{}
func (u lobEventUniques) AsSlice() []constraint {
return []constraint{}
}
type lobEventChecks struct{}
func (c lobEventChecks) AsSlice() []check {
return []check{}
}


@@ -0,0 +1,157 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var LogImpersonations = Table[
logImpersonationColumns,
logImpersonationIndexes,
logImpersonationForeignKeys,
logImpersonationUniques,
logImpersonationChecks,
]{
Schema: "",
Name: "log_impersonation",
Columns: logImpersonationColumns{
BeginAt: column{
Name: "begin_at",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
EndAt: column{
Name: "end_at",
DBType: "timestamp without time zone",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('log_impersonation_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ImpersonatorID: column{
Name: "impersonator_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
TargetID: column{
Name: "target_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: logImpersonationIndexes{
LogImpersonationPkey: index{
Type: "btree",
Name: "log_impersonation_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "log_impersonation_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: logImpersonationForeignKeys{
LogImpersonationLogImpersonationImpersonatorIDFkey: foreignKey{
constraint: constraint{
Name: "log_impersonation.log_impersonation_impersonator_id_fkey",
Columns: []string{"impersonator_id"},
Comment: "",
},
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
LogImpersonationLogImpersonationTargetIDFkey: foreignKey{
constraint: constraint{
Name: "log_impersonation.log_impersonation_target_id_fkey",
Columns: []string{"target_id"},
Comment: "",
},
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type logImpersonationColumns struct {
BeginAt column
EndAt column
ID column
ImpersonatorID column
TargetID column
}
func (c logImpersonationColumns) AsSlice() []column {
return []column{
c.BeginAt, c.EndAt, c.ID, c.ImpersonatorID, c.TargetID,
}
}
type logImpersonationIndexes struct {
LogImpersonationPkey index
}
func (i logImpersonationIndexes) AsSlice() []index {
return []index{
i.LogImpersonationPkey,
}
}
type logImpersonationForeignKeys struct {
LogImpersonationLogImpersonationImpersonatorIDFkey foreignKey
LogImpersonationLogImpersonationTargetIDFkey foreignKey
}
func (f logImpersonationForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.LogImpersonationLogImpersonationImpersonatorIDFkey, f.LogImpersonationLogImpersonationTargetIDFkey,
}
}
type logImpersonationUniques struct{}
func (u logImpersonationUniques) AsSlice() []constraint {
return []constraint{}
}
type logImpersonationChecks struct{}
func (c logImpersonationChecks) AsSlice() []check {
return []check{}
}


@@ -78,6 +78,15 @@ var NoteImages = Table[
Generated: false,
AutoIncr: false,
},
ID: column{
Name: "id",
DBType: "integer",
Default: "IDENTITY",
Comment: "",
Nullable: false,
Generated: true,
AutoIncr: false,
},
},
Indexes: noteImageIndexes{
NoteImagePkey: index{
@@ -102,6 +111,23 @@ var NoteImages = Table[
Where: "",
Include: []string{},
},
NoteImageIDUnique: index{
Type: "btree",
Name: "note_image_id_unique",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "note_image_pkey",
@@ -137,6 +163,13 @@ var NoteImages = Table[
ForeignColumns: []string{"id"},
},
},
Uniques: noteImageUniques{
NoteImageIDUnique: constraint{
Name: "note_image_id_unique",
Columns: []string{"id"},
Comment: "",
},
},
Comment: "",
}
@@ -149,21 +182,23 @@ type noteImageColumns struct {
OrganizationID column
Version column
UUID column
ID column
}
func (c noteImageColumns) AsSlice() []column {
return []column{
c.Created, c.CreatorID, c.Deleted, c.DeletorID, c.OrganizationID, c.Version, c.UUID,
c.Created, c.CreatorID, c.Deleted, c.DeletorID, c.OrganizationID, c.Version, c.UUID, c.ID,
}
}
type noteImageIndexes struct {
NoteImagePkey index
NoteImagePkey index
NoteImageIDUnique index
}
func (i noteImageIndexes) AsSlice() []index {
return []index{
i.NoteImagePkey,
i.NoteImagePkey, i.NoteImageIDUnique,
}
}
@@ -179,10 +214,14 @@ func (f noteImageForeignKeys) AsSlice() []foreignKey {
}
}
type noteImageUniques struct{}
type noteImageUniques struct {
NoteImageIDUnique constraint
}
func (u noteImageUniques) AsSlice() []constraint {
return []constraint{}
return []constraint{
u.NoteImageIDUnique,
}
}
type noteImageChecks struct{}


@@ -321,6 +321,15 @@ var Organizations = Table[
Generated: false,
AutoIncr: false,
},
LobAddressID: column{
Name: "lob_address_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: organizationIndexes{
OrganizationPkey: index{
@@ -486,11 +495,12 @@ type organizationColumns struct {
FieldseekerServiceFeatureItemID column
ArcgisMapServiceID column
IsCatchall column
LobAddressID column
}
func (c organizationColumns) AsSlice() []column {
return []column{
c.ID, c.Name, c.ImportDistrictGid, c.Website, c.LogoUUID, c.Slug, c.GeneralManagerName, c.MailingAddressCity, c.MailingAddressPostalCode, c.MailingAddressStreet, c.OfficeAddressCity, c.OfficeAddressPostalCode, c.OfficeAddressStreet, c.ServiceAreaGeometry, c.ServiceAreaSquareMeters, c.ServiceAreaCentroid, c.ServiceAreaExtent, c.OfficeFax, c.OfficePhone, c.ServiceAreaXmin, c.ServiceAreaYmin, c.ServiceAreaXmax, c.ServiceAreaYmax, c.ServiceAreaCentroidGeojson, c.ServiceAreaCentroidX, c.ServiceAreaCentroidY, c.MailingAddressCountry, c.MailingAddressState, c.OfficeAddressCountry, c.OfficeAddressState, c.ArcgisAccountID, c.FieldseekerServiceFeatureItemID, c.ArcgisMapServiceID, c.IsCatchall,
c.ID, c.Name, c.ImportDistrictGid, c.Website, c.LogoUUID, c.Slug, c.GeneralManagerName, c.MailingAddressCity, c.MailingAddressPostalCode, c.MailingAddressStreet, c.OfficeAddressCity, c.OfficeAddressPostalCode, c.OfficeAddressStreet, c.ServiceAreaGeometry, c.ServiceAreaSquareMeters, c.ServiceAreaCentroid, c.ServiceAreaExtent, c.OfficeFax, c.OfficePhone, c.ServiceAreaXmin, c.ServiceAreaYmin, c.ServiceAreaXmax, c.ServiceAreaYmax, c.ServiceAreaCentroidGeojson, c.ServiceAreaCentroidX, c.ServiceAreaCentroidY, c.MailingAddressCountry, c.MailingAddressState, c.OfficeAddressCountry, c.OfficeAddressState, c.ArcgisAccountID, c.FieldseekerServiceFeatureItemID, c.ArcgisMapServiceID, c.IsCatchall, c.LobAddressID,
}
}


@@ -0,0 +1,112 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var PublicreportClients = Table[
publicreportClientColumns,
publicreportClientIndexes,
publicreportClientForeignKeys,
publicreportClientUniques,
publicreportClientChecks,
]{
Schema: "publicreport",
Name: "client",
Columns: publicreportClientColumns{
Created: column{
Name: "created",
DBType: "timestamp without time zone",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
UserAgent: column{
Name: "user_agent",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
UUID: column{
Name: "uuid",
DBType: "uuid",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: publicreportClientIndexes{
ClientPkey: index{
Type: "btree",
Name: "client_pkey",
Columns: []indexColumn{
{
Name: "uuid",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "client_pkey",
Columns: []string{"uuid"},
Comment: "",
},
Comment: "",
}
type publicreportClientColumns struct {
Created column
UserAgent column
UUID column
}
func (c publicreportClientColumns) AsSlice() []column {
return []column{
c.Created, c.UserAgent, c.UUID,
}
}
type publicreportClientIndexes struct {
ClientPkey index
}
func (i publicreportClientIndexes) AsSlice() []index {
return []index{
i.ClientPkey,
}
}
type publicreportClientForeignKeys struct{}
func (f publicreportClientForeignKeys) AsSlice() []foreignKey {
return []foreignKey{}
}
type publicreportClientUniques struct{}
func (u publicreportClientUniques) AsSlice() []constraint {
return []constraint{}
}
type publicreportClientChecks struct{}
func (c publicreportClientChecks) AsSlice() []check {
return []check{}
}


@@ -0,0 +1,197 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var PublicreportCompliances = Table[
publicreportComplianceColumns,
publicreportComplianceIndexes,
publicreportComplianceForeignKeys,
publicreportComplianceUniques,
publicreportComplianceChecks,
]{
Schema: "publicreport",
Name: "compliance",
Columns: publicreportComplianceColumns{
AccessInstructions: column{
Name: "access_instructions",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AvailabilityNotes: column{
Name: "availability_notes",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Comments: column{
Name: "comments",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
GateCode: column{
Name: "gate_code",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
HasDog: column{
Name: "has_dog",
DBType: "boolean",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
PermissionType: column{
Name: "permission_type",
DBType: "publicreport.permissionaccess",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ReportID: column{
Name: "report_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ReportPhoneCanText: column{
Name: "report_phone_can_text",
DBType: "boolean",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
WantsScheduled: column{
Name: "wants_scheduled",
DBType: "boolean",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
Submitted: column{
Name: "submitted",
DBType: "timestamp without time zone",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: publicreportComplianceIndexes{
CompliancePkey: index{
Type: "btree",
Name: "compliance_pkey",
Columns: []indexColumn{
{
Name: "report_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "compliance_pkey",
Columns: []string{"report_id"},
Comment: "",
},
ForeignKeys: publicreportComplianceForeignKeys{
PublicreportComplianceComplianceReportIDFkey: foreignKey{
constraint: constraint{
Name: "publicreport.compliance.compliance_report_id_fkey",
Columns: []string{"report_id"},
Comment: "",
},
ForeignTable: "publicreport.report",
ForeignColumns: []string{"id"},
},
},
Comment: "",
}
type publicreportComplianceColumns struct {
AccessInstructions column
AvailabilityNotes column
Comments column
GateCode column
HasDog column
PermissionType column
ReportID column
ReportPhoneCanText column
WantsScheduled column
Submitted column
}
func (c publicreportComplianceColumns) AsSlice() []column {
return []column{
c.AccessInstructions, c.AvailabilityNotes, c.Comments, c.GateCode, c.HasDog, c.PermissionType, c.ReportID, c.ReportPhoneCanText, c.WantsScheduled, c.Submitted,
}
}
type publicreportComplianceIndexes struct {
CompliancePkey index
}
func (i publicreportComplianceIndexes) AsSlice() []index {
return []index{
i.CompliancePkey,
}
}
type publicreportComplianceForeignKeys struct {
PublicreportComplianceComplianceReportIDFkey foreignKey
}
func (f publicreportComplianceForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.PublicreportComplianceComplianceReportIDFkey,
}
}
type publicreportComplianceUniques struct{}
func (u publicreportComplianceUniques) AsSlice() []constraint {
return []constraint{}
}
type publicreportComplianceChecks struct{}
func (c publicreportComplianceChecks) AsSlice() []check {
return []check{}
}


@@ -24,60 +24,6 @@ var PublicreportReports = Table[
Generated: false,
AutoIncr: false,
},
AddressNumber: column{
Name: "address_number",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressStreet: column{
Name: "address_street",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressLocality: column{
Name: "address_locality",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressRegion: column{
Name: "address_region",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressPostalCode: column{
Name: "address_postal_code",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressCountry: column{
Name: "address_country",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
AddressID: column{
Name: "address_id",
DBType: "integer",
@@ -240,22 +186,31 @@ var PublicreportReports = Table[
Generated: false,
AutoIncr: false,
},
LocationLatitude: column{
Name: "location_latitude",
DBType: "double precision",
Default: "GENERATED",
AddressGid: column{
Name: "address_gid",
DBType: "text",
Default: "",
Comment: "",
Nullable: true,
Generated: true,
Nullable: false,
Generated: false,
AutoIncr: false,
},
LocationLongitude: column{
Name: "location_longitude",
DBType: "double precision",
Default: "GENERATED",
ClientUUID: column{
Name: "client_uuid",
DBType: "uuid",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: true,
Generated: false,
AutoIncr: false,
},
ReporterPhoneCanSMS: column{
Name: "reporter_phone_can_sms",
DBType: "boolean",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
@@ -310,6 +265,15 @@ var PublicreportReports = Table[
ForeignTable: "address",
ForeignColumns: []string{"id"},
},
PublicreportReportReportClientUUIDFkey: foreignKey{
constraint: constraint{
Name: "publicreport.report.report_client_uuid_fkey",
Columns: []string{"client_uuid"},
Comment: "",
},
ForeignTable: "publicreport.client",
ForeignColumns: []string{"uuid"},
},
PublicreportReportReportOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "publicreport.report.report_organization_id_fkey",
@@ -342,12 +306,6 @@ var PublicreportReports = Table[
type publicreportReportColumns struct {
AddressRaw column
AddressNumber column
AddressStreet column
AddressLocality column
AddressRegion column
AddressPostalCode column
AddressCountry column
AddressID column
Created column
Location column
@@ -366,13 +324,14 @@ type publicreportReportColumns struct {
Reviewed column
ReviewerID column
Status column
LocationLatitude column
LocationLongitude column
AddressGid column
ClientUUID column
ReporterPhoneCanSMS column
}
func (c publicreportReportColumns) AsSlice() []column {
return []column{
c.AddressRaw, c.AddressNumber, c.AddressStreet, c.AddressLocality, c.AddressRegion, c.AddressPostalCode, c.AddressCountry, c.AddressID, c.Created, c.Location, c.H3cell, c.ID, c.LatlngAccuracyType, c.LatlngAccuracyValue, c.MapZoom, c.OrganizationID, c.PublicID, c.ReporterName, c.ReporterEmail, c.ReporterPhone, c.ReporterContactConsent, c.ReportType, c.Reviewed, c.ReviewerID, c.Status, c.LocationLatitude, c.LocationLongitude,
c.AddressRaw, c.AddressID, c.Created, c.Location, c.H3cell, c.ID, c.LatlngAccuracyType, c.LatlngAccuracyValue, c.MapZoom, c.OrganizationID, c.PublicID, c.ReporterName, c.ReporterEmail, c.ReporterPhone, c.ReporterContactConsent, c.ReportType, c.Reviewed, c.ReviewerID, c.Status, c.AddressGid, c.ClientUUID, c.ReporterPhoneCanSMS,
}
}
@@ -389,13 +348,14 @@ func (i publicreportReportIndexes) AsSlice() []index {
type publicreportReportForeignKeys struct {
PublicreportReportReportAddressIDFkey foreignKey
PublicreportReportReportClientUUIDFkey foreignKey
PublicreportReportReportOrganizationIDFkey foreignKey
PublicreportReportReportReviewerIDFkey foreignKey
}
func (f publicreportReportForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.PublicreportReportReportAddressIDFkey, f.PublicreportReportReportOrganizationIDFkey, f.PublicreportReportReportReviewerIDFkey,
f.PublicreportReportReportAddressIDFkey, f.PublicreportReportReportClientUUIDFkey, f.PublicreportReportReportOrganizationIDFkey, f.PublicreportReportReportReviewerIDFkey,
}
}


@@ -168,6 +168,15 @@ var PublicreportWaters = Table[
Generated: false,
AutoIncr: false,
},
Duration: column{
Name: "duration",
DBType: "publicreport.nuisancedurationtype",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: publicreportWaterIndexes{
WaterPkey: index{
@@ -226,11 +235,12 @@ type publicreportWaterColumns struct {
OwnerName column
OwnerPhone column
ReportID column
Duration column
}
func (c publicreportWaterColumns) AsSlice() []column {
return []column{
c.AccessComments, c.AccessGate, c.AccessFence, c.AccessLocked, c.AccessDog, c.AccessOther, c.Comments, c.IsReporterConfidential, c.IsReporterOwner, c.HasAdult, c.HasBackyardPermission, c.HasLarvae, c.HasPupae, c.OwnerEmail, c.OwnerName, c.OwnerPhone, c.ReportID,
c.AccessComments, c.AccessGate, c.AccessFence, c.AccessLocked, c.AccessDog, c.AccessOther, c.Comments, c.IsReporterConfidential, c.IsReporterOwner, c.HasAdult, c.HasBackyardPermission, c.HasLarvae, c.HasPupae, c.OwnerEmail, c.OwnerName, c.OwnerPhone, c.ReportID, c.Duration,
}
}


@@ -78,15 +78,6 @@ var Signals = Table[
Generated: false,
AutoIncr: false,
},
Title: column{
Name: "title",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Type: column{
Name: "type_",
DBType: "public.signaltype",
@@ -96,6 +87,42 @@ var Signals = Table[
Generated: false,
AutoIncr: false,
},
SiteID: column{
Name: "site_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
Location: column{
Name: "location",
DBType: "geometry",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
FeaturePoolFeatureID: column{
Name: "feature_pool_feature_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
ReportID: column{
Name: "report_id",
DBType: "integer",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: signalIndexes{
SignalPkey: index{
@@ -115,6 +142,23 @@ var Signals = Table[
Where: "",
Include: []string{},
},
IdxSignalLocation: index{
Type: "gist",
Name: "idx_signal_location",
Columns: []indexColumn{
{
Name: "location",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: false,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "signal_pkey",
@@ -140,6 +184,15 @@ var Signals = Table[
ForeignTable: "user_",
ForeignColumns: []string{"id"},
},
SignalSignalFeaturePoolFeatureIDFkey: foreignKey{
constraint: constraint{
Name: "signal.signal_feature_pool_feature_id_fkey",
Columns: []string{"feature_pool_feature_id"},
Comment: "",
},
ForeignTable: "feature_pool",
ForeignColumns: []string{"feature_id"},
},
SignalSignalOrganizationIDFkey: foreignKey{
constraint: constraint{
Name: "signal.signal_organization_id_fkey",
@@ -149,48 +202,83 @@ var Signals = Table[
ForeignTable: "organization",
ForeignColumns: []string{"id"},
},
SignalSignalReportIDFkey: foreignKey{
constraint: constraint{
Name: "signal.signal_report_id_fkey",
Columns: []string{"report_id"},
Comment: "",
},
ForeignTable: "publicreport.report",
ForeignColumns: []string{"id"},
},
SignalSignalSiteIDFkey: foreignKey{
constraint: constraint{
Name: "signal.signal_site_id_fkey",
Columns: []string{"site_id"},
Comment: "",
},
ForeignTable: "site",
ForeignColumns: []string{"id"},
},
},
Checks: signalChecks{
CheckExclusiveReference: check{
constraint: constraint{
Name: "check_exclusive_reference",
Columns: []string{"feature_pool_feature_id", "report_id"},
Comment: "",
},
Expression: "((feature_pool_feature_id IS NULL) OR (report_id IS NULL))",
},
},
Comment: "",
}
type signalColumns struct {
Addressed column
Addressor column
Created column
Creator column
ID column
OrganizationID column
Species column
Title column
Type column
Addressed column
Addressor column
Created column
Creator column
ID column
OrganizationID column
Species column
Type column
SiteID column
Location column
FeaturePoolFeatureID column
ReportID column
}
func (c signalColumns) AsSlice() []column {
return []column{
c.Addressed, c.Addressor, c.Created, c.Creator, c.ID, c.OrganizationID, c.Species, c.Title, c.Type,
c.Addressed, c.Addressor, c.Created, c.Creator, c.ID, c.OrganizationID, c.Species, c.Type, c.SiteID, c.Location, c.FeaturePoolFeatureID, c.ReportID,
}
}
type signalIndexes struct {
SignalPkey index
SignalPkey index
IdxSignalLocation index
}
func (i signalIndexes) AsSlice() []index {
return []index{
i.SignalPkey,
i.SignalPkey, i.IdxSignalLocation,
}
}
type signalForeignKeys struct {
SignalSignalAddressorFkey foreignKey
SignalSignalCreatorFkey foreignKey
SignalSignalOrganizationIDFkey foreignKey
SignalSignalAddressorFkey foreignKey
SignalSignalCreatorFkey foreignKey
SignalSignalFeaturePoolFeatureIDFkey foreignKey
SignalSignalOrganizationIDFkey foreignKey
SignalSignalReportIDFkey foreignKey
SignalSignalSiteIDFkey foreignKey
}
func (f signalForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.SignalSignalAddressorFkey, f.SignalSignalCreatorFkey, f.SignalSignalOrganizationIDFkey,
f.SignalSignalAddressorFkey, f.SignalSignalCreatorFkey, f.SignalSignalFeaturePoolFeatureIDFkey, f.SignalSignalOrganizationIDFkey, f.SignalSignalReportIDFkey, f.SignalSignalSiteIDFkey,
}
}
@@ -200,8 +288,12 @@ func (u signalUniques) AsSlice() []constraint {
return []constraint{}
}
type signalChecks struct{}
type signalChecks struct {
CheckExclusiveReference check
}
func (c signalChecks) AsSlice() []check {
return []check{}
return []check{
c.CheckExclusiveReference,
}
}


@@ -15,15 +15,6 @@ var TileCachedImages = Table[
Schema: "tile",
Name: "cached_image",
Columns: tileCachedImageColumns{
ArcgisID: column{
Name: "arcgis_id",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
X: column{
Name: "x",
DBType: "integer",
@@ -60,6 +51,15 @@ var TileCachedImages = Table[
Generated: false,
AutoIncr: false,
},
ServiceID: column{
Name: "service_id",
DBType: "integer",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: tileCachedImageIndexes{
CachedImagePkey: index{
@@ -67,7 +67,7 @@ var TileCachedImages = Table[
Name: "cached_image_pkey",
Columns: []indexColumn{
{
Name: "arcgis_id",
Name: "service_id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
@@ -97,18 +97,18 @@ var TileCachedImages = Table[
},
PrimaryKey: &constraint{
Name: "cached_image_pkey",
Columns: []string{"arcgis_id", "x", "y", "z"},
Columns: []string{"service_id", "x", "y", "z"},
Comment: "",
},
ForeignKeys: tileCachedImageForeignKeys{
TileCachedImageCachedImageArcgisIDFkey: foreignKey{
TileCachedImageCachedImageServiceIDFkey: foreignKey{
constraint: constraint{
Name: "tile.cached_image.cached_image_arcgis_id_fkey",
Columns: []string{"arcgis_id"},
Name: "tile.cached_image.cached_image_service_id_fkey",
Columns: []string{"service_id"},
Comment: "",
},
ForeignTable: "arcgis.service_map",
ForeignColumns: []string{"arcgis_id"},
ForeignTable: "tile.service",
ForeignColumns: []string{"id"},
},
},
@@ -116,16 +116,16 @@ var TileCachedImages = Table[
}
type tileCachedImageColumns struct {
ArcgisID column
X column
Y column
Z column
IsEmpty column
X column
Y column
Z column
IsEmpty column
ServiceID column
}
func (c tileCachedImageColumns) AsSlice() []column {
return []column{
c.ArcgisID, c.X, c.Y, c.Z, c.IsEmpty,
c.X, c.Y, c.Z, c.IsEmpty, c.ServiceID,
}
}
@@ -140,12 +140,12 @@ func (i tileCachedImageIndexes) AsSlice() []index {
}
type tileCachedImageForeignKeys struct {
TileCachedImageCachedImageArcgisIDFkey foreignKey
TileCachedImageCachedImageServiceIDFkey foreignKey
}
func (f tileCachedImageForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.TileCachedImageCachedImageArcgisIDFkey,
f.TileCachedImageCachedImageServiceIDFkey,
}
}


@@ -0,0 +1,156 @@
// Code generated by BobGen psql v0.42.5. DO NOT EDIT.
// This file is meant to be re-generated in place and/or deleted at any time.
package dbinfo
import "github.com/aarondl/opt/null"
var TileServices = Table[
tileServiceColumns,
tileServiceIndexes,
tileServiceForeignKeys,
tileServiceUniques,
tileServiceChecks,
]{
Schema: "tile",
Name: "service",
Columns: tileServiceColumns{
ID: column{
Name: "id",
DBType: "integer",
Default: "nextval('tile.service_id_seq'::regclass)",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
Name: column{
Name: "name",
DBType: "text",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
ArcgisID: column{
Name: "arcgis_id",
DBType: "text",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
},
Indexes: tileServiceIndexes{
ServicePkey: index{
Type: "btree",
Name: "service_pkey",
Columns: []indexColumn{
{
Name: "id",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
ServiceNameUnique: index{
Type: "btree",
Name: "service_name_unique",
Columns: []indexColumn{
{
Name: "name",
Desc: null.FromCond(false, true),
IsExpression: false,
},
},
Unique: true,
Comment: "",
NullsFirst: []bool{false},
NullsDistinct: false,
Where: "",
Include: []string{},
},
},
PrimaryKey: &constraint{
Name: "service_pkey",
Columns: []string{"id"},
Comment: "",
},
ForeignKeys: tileServiceForeignKeys{
TileServiceServiceArcgisIDFkey: foreignKey{
constraint: constraint{
Name: "tile.service.service_arcgis_id_fkey",
Columns: []string{"arcgis_id"},
Comment: "",
},
ForeignTable: "arcgis.service_map",
ForeignColumns: []string{"arcgis_id"},
},
},
Uniques: tileServiceUniques{
ServiceNameUnique: constraint{
Name: "service_name_unique",
Columns: []string{"name"},
Comment: "",
},
},
Comment: "",
}
type tileServiceColumns struct {
ID column
Name column
ArcgisID column
}
func (c tileServiceColumns) AsSlice() []column {
return []column{
c.ID, c.Name, c.ArcgisID,
}
}
type tileServiceIndexes struct {
ServicePkey index
ServiceNameUnique index
}
func (i tileServiceIndexes) AsSlice() []index {
return []index{
i.ServicePkey, i.ServiceNameUnique,
}
}
type tileServiceForeignKeys struct {
TileServiceServiceArcgisIDFkey foreignKey
}
func (f tileServiceForeignKeys) AsSlice() []foreignKey {
return []foreignKey{
f.TileServiceServiceArcgisIDFkey,
}
}
type tileServiceUniques struct {
ServiceNameUnique constraint
}
func (u tileServiceUniques) AsSlice() []constraint {
return []constraint{
u.ServiceNameUnique,
}
}
type tileServiceChecks struct{}
func (c tileServiceChecks) AsSlice() []check {
return []check{}
}


@@ -132,6 +132,42 @@ var Users = Table[
Generated: false,
AutoIncr: false,
},
Avatar: column{
Name: "avatar",
DBType: "uuid",
Default: "NULL",
Comment: "",
Nullable: true,
Generated: false,
AutoIncr: false,
},
IsActive: column{
Name: "is_active",
DBType: "boolean",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
IsDronePilot: column{
Name: "is_drone_pilot",
DBType: "boolean",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
IsWarrant: column{
Name: "is_warrant",
DBType: "boolean",
Default: "",
Comment: "",
Nullable: false,
Generated: false,
AutoIncr: false,
},
},
Indexes: userIndexes{
UserPkey: index{
@@ -210,11 +246,15 @@ type userColumns struct {
PasswordHashType column
PasswordHash column
Role column
Avatar column
IsActive column
IsDronePilot column
IsWarrant column
}
func (c userColumns) AsSlice() []column {
return []column{
c.ID, c.ArcgisAccessToken, c.ArcgisLicense, c.ArcgisRefreshToken, c.ArcgisRefreshTokenExpires, c.ArcgisRole, c.DisplayName, c.Email, c.OrganizationID, c.Username, c.PasswordHashType, c.PasswordHash, c.Role,
c.ID, c.ArcgisAccessToken, c.ArcgisLicense, c.ArcgisRefreshToken, c.ArcgisRefreshTokenExpires, c.ArcgisRole, c.DisplayName, c.Email, c.OrganizationID, c.Username, c.PasswordHashType, c.PasswordHash, c.Role, c.Avatar, c.IsActive, c.IsDronePilot, c.IsWarrant,
}
}
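The userColumns diff above adds four fields and widens AsSlice to match; the struct and the slice must be kept in sync by hand whenever a column is generated. A minimal sketch of a reflection check that would catch a field missing from AsSlice (trimmed, hypothetical stand-ins for the generated `column` and `userColumns` types, not the real ones):

```go
package main

import (
	"fmt"
	"reflect"
)

// Trimmed stand-ins for the generated types (illustration only).
type column struct{ Name string }

type userColumns struct {
	ID       column
	Avatar   column
	IsActive column
}

// AsSlice must enumerate every struct field by hand, just like the
// generated code in the diff above.
func (c userColumns) AsSlice() []column {
	return []column{c.ID, c.Avatar, c.IsActive}
}

// coversAllFields reports whether AsSlice returns one entry per struct
// field, so a newly generated column cannot be silently dropped.
func coversAllFields(c userColumns) bool {
	return len(c.AsSlice()) == reflect.TypeOf(c).NumField()
}

func main() {
	c := userColumns{
		ID:       column{Name: "id"},
		Avatar:   column{Name: "avatar"},
		IsActive: column{Name: "is_active"},
	}
	fmt.Println(coversAllFields(c)) // counts match: true
}
```

In the generated code this invariant is maintained by the generator itself; the check is only useful if the files are ever edited by hand.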

@ -8,270 +8,6 @@ import (
"fmt"
)
// Enum values for ArcgisFieldtype
const (
ArcgisFieldtypeEsrifieldtypesmallinteger ArcgisFieldtype = "esriFieldTypeSmallInteger"
ArcgisFieldtypeEsrifieldtypeinteger ArcgisFieldtype = "esriFieldTypeInteger"
ArcgisFieldtypeEsrifieldtypesingle ArcgisFieldtype = "esriFieldTypeSingle"
ArcgisFieldtypeEsrifieldtypedouble ArcgisFieldtype = "esriFieldTypeDouble"
ArcgisFieldtypeEsrifieldtypestring ArcgisFieldtype = "esriFieldTypeString"
ArcgisFieldtypeEsrifieldtypedate ArcgisFieldtype = "esriFieldTypeDate"
ArcgisFieldtypeEsrifieldtypeoid ArcgisFieldtype = "esriFieldTypeOID"
ArcgisFieldtypeEsrifieldtypegeometry ArcgisFieldtype = "esriFieldTypeGeometry"
ArcgisFieldtypeEsrifieldtypeblob ArcgisFieldtype = "esriFieldTypeBlob"
ArcgisFieldtypeEsrifieldtyperaster ArcgisFieldtype = "esriFieldTypeRaster"
ArcgisFieldtypeEsrifieldtypeguid ArcgisFieldtype = "esriFieldTypeGUID"
ArcgisFieldtypeEsrifieldtypeglobalid ArcgisFieldtype = "esriFieldTypeGlobalID"
ArcgisFieldtypeEsrifieldtypexml ArcgisFieldtype = "esriFieldTypeXML"
ArcgisFieldtypeEsrifieldtypebiginteger ArcgisFieldtype = "esriFieldTypeBigInteger"
)
func AllArcgisFieldtype() []ArcgisFieldtype {
return []ArcgisFieldtype{
ArcgisFieldtypeEsrifieldtypesmallinteger,
ArcgisFieldtypeEsrifieldtypeinteger,
ArcgisFieldtypeEsrifieldtypesingle,
ArcgisFieldtypeEsrifieldtypedouble,
ArcgisFieldtypeEsrifieldtypestring,
ArcgisFieldtypeEsrifieldtypedate,
ArcgisFieldtypeEsrifieldtypeoid,
ArcgisFieldtypeEsrifieldtypegeometry,
ArcgisFieldtypeEsrifieldtypeblob,
ArcgisFieldtypeEsrifieldtyperaster,
ArcgisFieldtypeEsrifieldtypeguid,
ArcgisFieldtypeEsrifieldtypeglobalid,
ArcgisFieldtypeEsrifieldtypexml,
ArcgisFieldtypeEsrifieldtypebiginteger,
}
}
type ArcgisFieldtype string
func (e ArcgisFieldtype) String() string {
return string(e)
}
func (e ArcgisFieldtype) Valid() bool {
switch e {
case ArcgisFieldtypeEsrifieldtypesmallinteger,
ArcgisFieldtypeEsrifieldtypeinteger,
ArcgisFieldtypeEsrifieldtypesingle,
ArcgisFieldtypeEsrifieldtypedouble,
ArcgisFieldtypeEsrifieldtypestring,
ArcgisFieldtypeEsrifieldtypedate,
ArcgisFieldtypeEsrifieldtypeoid,
ArcgisFieldtypeEsrifieldtypegeometry,
ArcgisFieldtypeEsrifieldtypeblob,
ArcgisFieldtypeEsrifieldtyperaster,
ArcgisFieldtypeEsrifieldtypeguid,
ArcgisFieldtypeEsrifieldtypeglobalid,
ArcgisFieldtypeEsrifieldtypexml,
ArcgisFieldtypeEsrifieldtypebiginteger:
return true
default:
return false
}
}
// useful when testing in other packages
func (e ArcgisFieldtype) All() []ArcgisFieldtype {
return AllArcgisFieldtype()
}
func (e ArcgisFieldtype) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisFieldtype) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e ArcgisFieldtype) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisFieldtype) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e ArcgisFieldtype) Value() (driver.Value, error) {
return string(e), nil
}
func (e *ArcgisFieldtype) Scan(value any) error {
switch x := value.(type) {
case string:
*e = ArcgisFieldtype(x)
case []byte:
*e = ArcgisFieldtype(x)
case nil:
return fmt.Errorf("cannot scan nil into ArcgisFieldtype")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid ArcgisFieldtype value: %s", *e)
}
return nil
}
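Every generated enum wrapper in this file follows the same contract: Scan accepts a string or []byte, rejects nil, and re-validates the result against Valid. A self-contained sketch of that contract, using a trimmed two-value copy of the type (the real generated enum carries all fourteen esriFieldType values):

```go
package main

import "fmt"

// Trimmed copy of the generated wrapper, for illustration only.
type ArcgisFieldtype string

const (
	ArcgisFieldtypeEsrifieldtypestring  ArcgisFieldtype = "esriFieldTypeString"
	ArcgisFieldtypeEsrifieldtypeinteger ArcgisFieldtype = "esriFieldTypeInteger"
)

func (e ArcgisFieldtype) Valid() bool {
	switch e {
	case ArcgisFieldtypeEsrifieldtypestring, ArcgisFieldtypeEsrifieldtypeinteger:
		return true
	default:
		return false
	}
}

// Scan mirrors the generated method: accept string or []byte, reject
// nil, and reject any value that fails Valid().
func (e *ArcgisFieldtype) Scan(value any) error {
	switch x := value.(type) {
	case string:
		*e = ArcgisFieldtype(x)
	case []byte:
		*e = ArcgisFieldtype(x)
	case nil:
		return fmt.Errorf("cannot scan nil into ArcgisFieldtype")
	default:
		return fmt.Errorf("cannot scan type %T: %v", value, value)
	}
	if !e.Valid() {
		return fmt.Errorf("invalid ArcgisFieldtype value: %s", *e)
	}
	return nil
}

func main() {
	var e ArcgisFieldtype
	fmt.Println(e.Scan([]byte("esriFieldTypeString"))) // <nil>
	fmt.Println(e, e.Valid())                          // esriFieldTypeString true

	var bad ArcgisFieldtype
	fmt.Println(bad.Scan("esriFieldTypeBogus") != nil) // true: fails Valid()
}
```

Because Scan validates after assignment, a bad database value surfaces as an error at read time rather than propagating an out-of-range enum through the application.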
// Enum values for ArcgisMappingdestinationaddress
const (
ArcgisMappingdestinationaddressCountry ArcgisMappingdestinationaddress = "country"
ArcgisMappingdestinationaddressLocality ArcgisMappingdestinationaddress = "locality"
ArcgisMappingdestinationaddressPostalCode ArcgisMappingdestinationaddress = "postal_code"
ArcgisMappingdestinationaddressStreet ArcgisMappingdestinationaddress = "street"
ArcgisMappingdestinationaddressUnit ArcgisMappingdestinationaddress = "unit"
)
func AllArcgisMappingdestinationaddress() []ArcgisMappingdestinationaddress {
return []ArcgisMappingdestinationaddress{
ArcgisMappingdestinationaddressCountry,
ArcgisMappingdestinationaddressLocality,
ArcgisMappingdestinationaddressPostalCode,
ArcgisMappingdestinationaddressStreet,
ArcgisMappingdestinationaddressUnit,
}
}
type ArcgisMappingdestinationaddress string
func (e ArcgisMappingdestinationaddress) String() string {
return string(e)
}
func (e ArcgisMappingdestinationaddress) Valid() bool {
switch e {
case ArcgisMappingdestinationaddressCountry,
ArcgisMappingdestinationaddressLocality,
ArcgisMappingdestinationaddressPostalCode,
ArcgisMappingdestinationaddressStreet,
ArcgisMappingdestinationaddressUnit:
return true
default:
return false
}
}
// useful when testing in other packages
func (e ArcgisMappingdestinationaddress) All() []ArcgisMappingdestinationaddress {
return AllArcgisMappingdestinationaddress()
}
func (e ArcgisMappingdestinationaddress) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisMappingdestinationaddress) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e ArcgisMappingdestinationaddress) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisMappingdestinationaddress) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e ArcgisMappingdestinationaddress) Value() (driver.Value, error) {
return string(e), nil
}
func (e *ArcgisMappingdestinationaddress) Scan(value any) error {
switch x := value.(type) {
case string:
*e = ArcgisMappingdestinationaddress(x)
case []byte:
*e = ArcgisMappingdestinationaddress(x)
case nil:
return fmt.Errorf("cannot scan nil into ArcgisMappingdestinationaddress")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid ArcgisMappingdestinationaddress value: %s", *e)
}
return nil
}
// Enum values for ArcgisMappingdestinationparcel
const (
ArcgisMappingdestinationparcelApn ArcgisMappingdestinationparcel = "apn"
ArcgisMappingdestinationparcelDescription ArcgisMappingdestinationparcel = "description"
)
func AllArcgisMappingdestinationparcel() []ArcgisMappingdestinationparcel {
return []ArcgisMappingdestinationparcel{
ArcgisMappingdestinationparcelApn,
ArcgisMappingdestinationparcelDescription,
}
}
type ArcgisMappingdestinationparcel string
func (e ArcgisMappingdestinationparcel) String() string {
return string(e)
}
func (e ArcgisMappingdestinationparcel) Valid() bool {
switch e {
case ArcgisMappingdestinationparcelApn,
ArcgisMappingdestinationparcelDescription:
return true
default:
return false
}
}
// useful when testing in other packages
func (e ArcgisMappingdestinationparcel) All() []ArcgisMappingdestinationparcel {
return AllArcgisMappingdestinationparcel()
}
func (e ArcgisMappingdestinationparcel) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisMappingdestinationparcel) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e ArcgisMappingdestinationparcel) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *ArcgisMappingdestinationparcel) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e ArcgisMappingdestinationparcel) Value() (driver.Value, error) {
return string(e), nil
}
func (e *ArcgisMappingdestinationparcel) Scan(value any) error {
switch x := value.(type) {
case string:
*e = ArcgisMappingdestinationparcel(x)
case []byte:
*e = ArcgisMappingdestinationparcel(x)
case nil:
return fmt.Errorf("cannot scan nil into ArcgisMappingdestinationparcel")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid ArcgisMappingdestinationparcel value: %s", *e)
}
return nil
}
// Enum values for Arcgislicensetype
const (
ArcgislicensetypeAdvancedut Arcgislicensetype = "advancedUT"
@ -846,26 +582,44 @@ func (e *CommsTextorigin) Scan(value any) error {
return nil
}
// Enum values for Countrytype
// Enum values for Communicationlogentry
const (
CountrytypeUsa Countrytype = "usa"
CommunicationlogentryCreated Communicationlogentry = "created"
CommunicationlogentryStatusU2eclosed Communicationlogentry = "status.closed"
CommunicationlogentryStatusU2einvalidated Communicationlogentry = "status.invalidated"
CommunicationlogentryStatusU2eopened Communicationlogentry = "status.opened"
CommunicationlogentryStatusU2epending Communicationlogentry = "status.pending"
CommunicationlogentryStatusU2epossibleIssue Communicationlogentry = "status.possible-issue"
CommunicationlogentryStatusU2epossibleResolved Communicationlogentry = "status.possible-resolved"
)
func AllCountrytype() []Countrytype {
return []Countrytype{
CountrytypeUsa,
func AllCommunicationlogentry() []Communicationlogentry {
return []Communicationlogentry{
CommunicationlogentryCreated,
CommunicationlogentryStatusU2eclosed,
CommunicationlogentryStatusU2einvalidated,
CommunicationlogentryStatusU2eopened,
CommunicationlogentryStatusU2epending,
CommunicationlogentryStatusU2epossibleIssue,
CommunicationlogentryStatusU2epossibleResolved,
}
}
type Countrytype string
type Communicationlogentry string
func (e Countrytype) String() string {
func (e Communicationlogentry) String() string {
return string(e)
}
func (e Countrytype) Valid() bool {
func (e Communicationlogentry) Valid() bool {
switch e {
case CountrytypeUsa:
case CommunicationlogentryCreated,
CommunicationlogentryStatusU2eclosed,
CommunicationlogentryStatusU2einvalidated,
CommunicationlogentryStatusU2eopened,
CommunicationlogentryStatusU2epending,
CommunicationlogentryStatusU2epossibleIssue,
CommunicationlogentryStatusU2epossibleResolved:
return true
default:
return false
@ -873,44 +627,226 @@ func (e Countrytype) Valid() bool {
}
// useful when testing in other packages
func (e Countrytype) All() []Countrytype {
return AllCountrytype()
func (e Communicationlogentry) All() []Communicationlogentry {
return AllCommunicationlogentry()
}
func (e Countrytype) MarshalText() ([]byte, error) {
func (e Communicationlogentry) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *Countrytype) UnmarshalText(text []byte) error {
func (e *Communicationlogentry) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e Countrytype) MarshalBinary() ([]byte, error) {
func (e Communicationlogentry) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *Countrytype) UnmarshalBinary(data []byte) error {
func (e *Communicationlogentry) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e Countrytype) Value() (driver.Value, error) {
func (e Communicationlogentry) Value() (driver.Value, error) {
return string(e), nil
}
func (e *Countrytype) Scan(value any) error {
func (e *Communicationlogentry) Scan(value any) error {
switch x := value.(type) {
case string:
*e = Countrytype(x)
*e = Communicationlogentry(x)
case []byte:
*e = Countrytype(x)
*e = Communicationlogentry(x)
case nil:
return fmt.Errorf("cannot scan nil into Countrytype")
return fmt.Errorf("cannot scan nil into Communicationlogentry")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid Countrytype value: %s", *e)
return fmt.Errorf("invalid Communicationlogentry value: %s", *e)
}
return nil
}
// Enum values for Communicationstatus
const (
CommunicationstatusClosed Communicationstatus = "closed"
CommunicationstatusInvalid Communicationstatus = "invalid"
CommunicationstatusNew Communicationstatus = "new"
CommunicationstatusOpened Communicationstatus = "opened"
CommunicationstatusPending Communicationstatus = "pending"
CommunicationstatusPossibleIssue Communicationstatus = "possible-issue"
CommunicationstatusPossibleResolved Communicationstatus = "possible-resolved"
CommunicationstatusResolved Communicationstatus = "resolved"
)
func AllCommunicationstatus() []Communicationstatus {
return []Communicationstatus{
CommunicationstatusClosed,
CommunicationstatusInvalid,
CommunicationstatusNew,
CommunicationstatusOpened,
CommunicationstatusPending,
CommunicationstatusPossibleIssue,
CommunicationstatusPossibleResolved,
CommunicationstatusResolved,
}
}
type Communicationstatus string
func (e Communicationstatus) String() string {
return string(e)
}
func (e Communicationstatus) Valid() bool {
switch e {
case CommunicationstatusClosed,
CommunicationstatusInvalid,
CommunicationstatusNew,
CommunicationstatusOpened,
CommunicationstatusPending,
CommunicationstatusPossibleIssue,
CommunicationstatusPossibleResolved,
CommunicationstatusResolved:
return true
default:
return false
}
}
// useful when testing in other packages
func (e Communicationstatus) All() []Communicationstatus {
return AllCommunicationstatus()
}
func (e Communicationstatus) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *Communicationstatus) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e Communicationstatus) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *Communicationstatus) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e Communicationstatus) Value() (driver.Value, error) {
return string(e), nil
}
func (e *Communicationstatus) Scan(value any) error {
switch x := value.(type) {
case string:
*e = Communicationstatus(x)
case []byte:
*e = Communicationstatus(x)
case nil:
return fmt.Errorf("cannot scan nil into Communicationstatus")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid Communicationstatus value: %s", *e)
}
return nil
}
// Enum values for Communicationstatustype
const (
CommunicationstatustypeClosed Communicationstatustype = "closed"
CommunicationstatustypeInvalid Communicationstatustype = "invalid"
CommunicationstatustypeNew Communicationstatustype = "new"
CommunicationstatustypeOpened Communicationstatustype = "opened"
CommunicationstatustypePending Communicationstatustype = "pending"
CommunicationstatustypePossibleIssue Communicationstatustype = "possible-issue"
CommunicationstatustypePossibleResolved Communicationstatustype = "possible-resolved"
CommunicationstatustypeResolved Communicationstatustype = "resolved"
)
func AllCommunicationstatustype() []Communicationstatustype {
return []Communicationstatustype{
CommunicationstatustypeClosed,
CommunicationstatustypeInvalid,
CommunicationstatustypeNew,
CommunicationstatustypeOpened,
CommunicationstatustypePending,
CommunicationstatustypePossibleIssue,
CommunicationstatustypePossibleResolved,
CommunicationstatustypeResolved,
}
}
type Communicationstatustype string
func (e Communicationstatustype) String() string {
return string(e)
}
func (e Communicationstatustype) Valid() bool {
switch e {
case CommunicationstatustypeClosed,
CommunicationstatustypeInvalid,
CommunicationstatustypeNew,
CommunicationstatustypeOpened,
CommunicationstatustypePending,
CommunicationstatustypePossibleIssue,
CommunicationstatustypePossibleResolved,
CommunicationstatustypeResolved:
return true
default:
return false
}
}
// useful when testing in other packages
func (e Communicationstatustype) All() []Communicationstatustype {
return AllCommunicationstatustype()
}
func (e Communicationstatustype) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *Communicationstatustype) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e Communicationstatustype) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *Communicationstatustype) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e Communicationstatustype) Value() (driver.Value, error) {
return string(e), nil
}
func (e *Communicationstatustype) Scan(value any) error {
switch x := value.(type) {
case string:
*e = Communicationstatustype(x)
case []byte:
*e = Communicationstatustype(x)
case nil:
return fmt.Errorf("cannot scan nil into Communicationstatustype")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid Communicationstatustype value: %s", *e)
}
return nil
@ -1304,6 +1240,7 @@ const (
JobtypeEmailSend Jobtype = "email-send"
JobtypeTextRespond Jobtype = "text-respond"
JobtypeTextSend Jobtype = "text-send"
JobtypeComplianceMailerSend Jobtype = "compliance-mailer-send"
)
func AllJobtype() []Jobtype {
@ -1315,6 +1252,7 @@ func AllJobtype() []Jobtype {
JobtypeEmailSend,
JobtypeTextRespond,
JobtypeTextSend,
JobtypeComplianceMailerSend,
}
}
@ -1332,7 +1270,8 @@ func (e Jobtype) Valid() bool {
JobtypeLabelStudioAudioCreate,
JobtypeEmailSend,
JobtypeTextRespond,
JobtypeTextSend:
JobtypeTextSend,
JobtypeComplianceMailerSend:
return true
default:
return false
@ -1881,6 +1820,85 @@ func (e *PublicreportNuisancedurationtype) Scan(value any) error {
return nil
}
// Enum values for PublicreportPermissionaccess
const (
PublicreportPermissionaccessDenied PublicreportPermissionaccess = "denied"
PublicreportPermissionaccessGranted PublicreportPermissionaccess = "granted"
PublicreportPermissionaccessUnselected PublicreportPermissionaccess = "unselected"
PublicreportPermissionaccessWithOwner PublicreportPermissionaccess = "with-owner"
)
func AllPublicreportPermissionaccess() []PublicreportPermissionaccess {
return []PublicreportPermissionaccess{
PublicreportPermissionaccessDenied,
PublicreportPermissionaccessGranted,
PublicreportPermissionaccessUnselected,
PublicreportPermissionaccessWithOwner,
}
}
type PublicreportPermissionaccess string
func (e PublicreportPermissionaccess) String() string {
return string(e)
}
func (e PublicreportPermissionaccess) Valid() bool {
switch e {
case PublicreportPermissionaccessDenied,
PublicreportPermissionaccessGranted,
PublicreportPermissionaccessUnselected,
PublicreportPermissionaccessWithOwner:
return true
default:
return false
}
}
// useful when testing in other packages
func (e PublicreportPermissionaccess) All() []PublicreportPermissionaccess {
return AllPublicreportPermissionaccess()
}
func (e PublicreportPermissionaccess) MarshalText() ([]byte, error) {
return []byte(e), nil
}
func (e *PublicreportPermissionaccess) UnmarshalText(text []byte) error {
return e.Scan(text)
}
func (e PublicreportPermissionaccess) MarshalBinary() ([]byte, error) {
return []byte(e), nil
}
func (e *PublicreportPermissionaccess) UnmarshalBinary(data []byte) error {
return e.Scan(data)
}
func (e PublicreportPermissionaccess) Value() (driver.Value, error) {
return string(e), nil
}
func (e *PublicreportPermissionaccess) Scan(value any) error {
switch x := value.(type) {
case string:
*e = PublicreportPermissionaccess(x)
case []byte:
*e = PublicreportPermissionaccess(x)
case nil:
return fmt.Errorf("cannot scan nil into PublicreportPermissionaccess")
default:
return fmt.Errorf("cannot scan type %T: %v", value, value)
}
if !e.Valid() {
return fmt.Errorf("invalid PublicreportPermissionaccess value: %s", *e)
}
return nil
}
// Enum values for PublicreportPoolsourceduration
const (
PublicreportPoolsourcedurationNone PublicreportPoolsourceduration = "none"
@ -2138,14 +2156,16 @@ func (e *PublicreportReportstatustype) Scan(value any) error {
// Enum values for PublicreportReporttype
const (
PublicreportReporttypeNuisance PublicreportReporttype = "nuisance"
PublicreportReporttypeWater PublicreportReporttype = "water"
PublicreportReporttypeNuisance PublicreportReporttype = "nuisance"
PublicreportReporttypeWater PublicreportReporttype = "water"
PublicreportReporttypeCompliance PublicreportReporttype = "compliance"
)
func AllPublicreportReporttype() []PublicreportReporttype {
return []PublicreportReporttype{
PublicreportReporttypeNuisance,
PublicreportReporttypeWater,
PublicreportReporttypeCompliance,
}
}
@ -2158,7 +2178,8 @@ func (e PublicreportReporttype) String() string {
func (e PublicreportReporttype) Valid() bool {
switch e {
case PublicreportReporttypeNuisance,
PublicreportReporttypeWater:
PublicreportReporttypeWater,
PublicreportReporttypeCompliance:
return true
default:
return false

@ -58,7 +58,7 @@ func SaveOrUpdateContainerRelate(ctx context.Context, org *models.Organization,
}
return []SqlParam{
Uint("p_objectid", row.ObjectID),
Int32("p_organization_id", org.ID),
Int32("p_organization_id", org.ID()),
UUID("p_globalid", row.GlobalID),
String("p_created_user", row.CreatedUser),
Timestamp("p_created_date", row.CreatedDate),

@ -0,0 +1,42 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package enum
import "github.com/Gleipnir-Technology/jet/postgres"
var Fieldtype = &struct {
EsriFieldTypeSmallInteger postgres.StringExpression
EsriFieldTypeInteger postgres.StringExpression
EsriFieldTypeSingle postgres.StringExpression
EsriFieldTypeDouble postgres.StringExpression
EsriFieldTypeString postgres.StringExpression
EsriFieldTypeDate postgres.StringExpression
EsriFieldTypeOID postgres.StringExpression
EsriFieldTypeGeometry postgres.StringExpression
EsriFieldTypeBlob postgres.StringExpression
EsriFieldTypeRaster postgres.StringExpression
EsriFieldTypeGUID postgres.StringExpression
EsriFieldTypeGlobalID postgres.StringExpression
EsriFieldTypeXML postgres.StringExpression
EsriFieldTypeBigInteger postgres.StringExpression
}{
EsriFieldTypeSmallInteger: postgres.NewEnumValue("esriFieldTypeSmallInteger"),
EsriFieldTypeInteger: postgres.NewEnumValue("esriFieldTypeInteger"),
EsriFieldTypeSingle: postgres.NewEnumValue("esriFieldTypeSingle"),
EsriFieldTypeDouble: postgres.NewEnumValue("esriFieldTypeDouble"),
EsriFieldTypeString: postgres.NewEnumValue("esriFieldTypeString"),
EsriFieldTypeDate: postgres.NewEnumValue("esriFieldTypeDate"),
EsriFieldTypeOID: postgres.NewEnumValue("esriFieldTypeOID"),
EsriFieldTypeGeometry: postgres.NewEnumValue("esriFieldTypeGeometry"),
EsriFieldTypeBlob: postgres.NewEnumValue("esriFieldTypeBlob"),
EsriFieldTypeRaster: postgres.NewEnumValue("esriFieldTypeRaster"),
EsriFieldTypeGUID: postgres.NewEnumValue("esriFieldTypeGUID"),
EsriFieldTypeGlobalID: postgres.NewEnumValue("esriFieldTypeGlobalID"),
EsriFieldTypeXML: postgres.NewEnumValue("esriFieldTypeXML"),
EsriFieldTypeBigInteger: postgres.NewEnumValue("esriFieldTypeBigInteger"),
}

@ -0,0 +1,24 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package enum
import "github.com/Gleipnir-Technology/jet/postgres"
var Mappingdestinationaddress = &struct {
Country postgres.StringExpression
Locality postgres.StringExpression
PostalCode postgres.StringExpression
Street postgres.StringExpression
Unit postgres.StringExpression
}{
Country: postgres.NewEnumValue("country"),
Locality: postgres.NewEnumValue("locality"),
PostalCode: postgres.NewEnumValue("postal_code"),
Street: postgres.NewEnumValue("street"),
Unit: postgres.NewEnumValue("unit"),
}

@ -0,0 +1,18 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package enum
import "github.com/Gleipnir-Technology/jet/postgres"
var Mappingdestinationparcel = &struct {
Apn postgres.StringExpression
Description postgres.StringExpression
}{
Apn: postgres.NewEnumValue("apn"),
Description: postgres.NewEnumValue("description"),
}

@ -0,0 +1,19 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
type Account struct {
ID string `sql:"primary_key"`
Name string
OrganizationID int32
URLFeatures *string
URLInsights *string
URLGeometry *string
URLNotebooks *string
URLTiles *string
}

@ -0,0 +1,16 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
type AddressMapping struct {
Destination Mappingdestinationaddress `sql:"primary_key"`
LayerFeatureServiceItemID string
LayerIndex int32
LayerFieldName string
OrganizationID int32 `sql:"primary_key"`
}

@ -0,0 +1,97 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
import "errors"
type Fieldtype string
const (
Fieldtype_EsriFieldTypeSmallInteger Fieldtype = "esriFieldTypeSmallInteger"
Fieldtype_EsriFieldTypeInteger Fieldtype = "esriFieldTypeInteger"
Fieldtype_EsriFieldTypeSingle Fieldtype = "esriFieldTypeSingle"
Fieldtype_EsriFieldTypeDouble Fieldtype = "esriFieldTypeDouble"
Fieldtype_EsriFieldTypeString Fieldtype = "esriFieldTypeString"
Fieldtype_EsriFieldTypeDate Fieldtype = "esriFieldTypeDate"
Fieldtype_EsriFieldTypeOID Fieldtype = "esriFieldTypeOID"
Fieldtype_EsriFieldTypeGeometry Fieldtype = "esriFieldTypeGeometry"
Fieldtype_EsriFieldTypeBlob Fieldtype = "esriFieldTypeBlob"
Fieldtype_EsriFieldTypeRaster Fieldtype = "esriFieldTypeRaster"
Fieldtype_EsriFieldTypeGUID Fieldtype = "esriFieldTypeGUID"
Fieldtype_EsriFieldTypeGlobalID Fieldtype = "esriFieldTypeGlobalID"
Fieldtype_EsriFieldTypeXML Fieldtype = "esriFieldTypeXML"
Fieldtype_EsriFieldTypeBigInteger Fieldtype = "esriFieldTypeBigInteger"
)
var FieldtypeAllValues = []Fieldtype{
Fieldtype_EsriFieldTypeSmallInteger,
Fieldtype_EsriFieldTypeInteger,
Fieldtype_EsriFieldTypeSingle,
Fieldtype_EsriFieldTypeDouble,
Fieldtype_EsriFieldTypeString,
Fieldtype_EsriFieldTypeDate,
Fieldtype_EsriFieldTypeOID,
Fieldtype_EsriFieldTypeGeometry,
Fieldtype_EsriFieldTypeBlob,
Fieldtype_EsriFieldTypeRaster,
Fieldtype_EsriFieldTypeGUID,
Fieldtype_EsriFieldTypeGlobalID,
Fieldtype_EsriFieldTypeXML,
Fieldtype_EsriFieldTypeBigInteger,
}
func (e *Fieldtype) Scan(value interface{}) error {
var enumValue string
switch val := value.(type) {
case string:
enumValue = val
case []byte:
enumValue = string(val)
default:
return errors.New("jet: Invalid scan value for Fieldtype enum. Enum value has to be of type string or []byte")
}
switch enumValue {
case "esriFieldTypeSmallInteger":
*e = Fieldtype_EsriFieldTypeSmallInteger
case "esriFieldTypeInteger":
*e = Fieldtype_EsriFieldTypeInteger
case "esriFieldTypeSingle":
*e = Fieldtype_EsriFieldTypeSingle
case "esriFieldTypeDouble":
*e = Fieldtype_EsriFieldTypeDouble
case "esriFieldTypeString":
*e = Fieldtype_EsriFieldTypeString
case "esriFieldTypeDate":
*e = Fieldtype_EsriFieldTypeDate
case "esriFieldTypeOID":
*e = Fieldtype_EsriFieldTypeOID
case "esriFieldTypeGeometry":
*e = Fieldtype_EsriFieldTypeGeometry
case "esriFieldTypeBlob":
*e = Fieldtype_EsriFieldTypeBlob
case "esriFieldTypeRaster":
*e = Fieldtype_EsriFieldTypeRaster
case "esriFieldTypeGUID":
*e = Fieldtype_EsriFieldTypeGUID
case "esriFieldTypeGlobalID":
*e = Fieldtype_EsriFieldTypeGlobalID
case "esriFieldTypeXML":
*e = Fieldtype_EsriFieldTypeXML
case "esriFieldTypeBigInteger":
*e = Fieldtype_EsriFieldTypeBigInteger
default:
return errors.New("jet: Invalid scan value '" + enumValue + "' for Fieldtype enum")
}
return nil
}
func (e Fieldtype) String() string {
return string(e)
}

@ -0,0 +1,18 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
import (
"github.com/twpayne/go-geom"
)
type Layer struct {
Extent geom.Bounds
FeatureServiceItemID string `sql:"primary_key"`
Index int32 `sql:"primary_key"`
}

@ -0,0 +1,15 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
type LayerField struct {
LayerFeatureServiceItemID string `sql:"primary_key"`
LayerIndex int32 `sql:"primary_key"`
Name string `sql:"primary_key"`
Type Fieldtype
}

@ -0,0 +1,61 @@
//
// Code generated by go-jet DO NOT EDIT.
//
// WARNING: Changes to this file may cause incorrect behavior
// and will be lost if the code is regenerated
//
package model
import "errors"
type Mappingdestinationaddress string
const (
Mappingdestinationaddress_Country Mappingdestinationaddress = "country"
Mappingdestinationaddress_Locality Mappingdestinationaddress = "locality"
Mappingdestinationaddress_PostalCode Mappingdestinationaddress = "postal_code"
Mappingdestinationaddress_Street Mappingdestinationaddress = "street"
Mappingdestinationaddress_Unit Mappingdestinationaddress = "unit"
)
var MappingdestinationaddressAllValues = []Mappingdestinationaddress{
Mappingdestinationaddress_Country,
Mappingdestinationaddress_Locality,
Mappingdestinationaddress_PostalCode,
Mappingdestinationaddress_Street,
Mappingdestinationaddress_Unit,
}
func (e *Mappingdestinationaddress) Scan(value interface{}) error {
var enumValue string
switch val := value.(type) {
case string:
enumValue = val
case []byte:
enumValue = string(val)
default:
return errors.New("jet: Invalid scan value for Mappingdestinationaddress enum. Enum value has to be of type string or []byte")
}
switch enumValue {
case "country":
*e = Mappingdestinationaddress_Country
case "locality":
*e = Mappingdestinationaddress_Locality
case "postal_code":
*e = Mappingdestinationaddress_PostalCode
case "street":
*e = Mappingdestinationaddress_Street
case "unit":
*e = Mappingdestinationaddress_Unit
default:
return errors.New("jet: Invalid scan value '" + enumValue + "' for Mappingdestinationaddress enum")
}
return nil
}
func (e Mappingdestinationaddress) String() string {
return string(e)
}
