From Peter Sanchez to ~netlandish/links-dev
Applied.

To git@git.code.netlandish.com:~netlandish/links
   44ef6e8..513b408  master -> master
From Peter Sanchez to ~netlandish/links-dev
---
One of the few places where we don't use the API for writing, and it was missing a max length check. This is the biggest issue with not using a single point for storing data: you end up missing checks that already exist elsewhere.

 core/import.go | 6 +++++-
 1 file changed, 5 insertions(+), 1 deletion(-)

diff --git a/core/import.go b/core/import.go
index c908823..7e6aebb 100644
--- a/core/import.go
+++ b/core/import.go
@@ -283,8 +283,12 @@ func processOrgLinks(obj importObj, baseURLMap map[string]int,
[message trimmed]
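For readers following along, the missing check is the usual clamp-before-insert pattern. A minimal sketch in Go; the helper name `truncateField` and the limit of 255 are illustrative assumptions, not taken from the actual patch:

```go
package main

import "fmt"

// maxTitleLen is illustrative; the real limit enforced by the links API
// layer is not shown in this message.
const maxTitleLen = 255

// truncateField clamps a user-supplied string to at most max runes -- the
// kind of check the import path was missing before this patch.
func truncateField(s string, max int) string {
	r := []rune(s)
	if len(r) <= max {
		return s
	}
	return string(r[:max])
}

func main() {
	fmt.Println(truncateField("a perfectly short title", maxTitleLen))
}
```

Clamping on runes rather than bytes avoids splitting a multi-byte character at the boundary.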
From Peter Sanchez to ~netlandish/links-dev
Applied.

To git@git.code.netlandish.com:~netlandish/links
   097aac3..44ef6e8  master -> master
From Peter Sanchez to ~netlandish/links-dev
---
Still pending some handlers to display audit logs, but I wanted to ensure we get record keeping in place ASAP as more and more users register.

 accounts/userfetch.go         |  65 +++++
 api/graph/schema.graphqls     |   6 +-
 api/graph/schema.resolvers.go | 459 +++++++++++++++++++++++++++++++---
 models/audit_log.go           |  65 +++++
 models/models.go              |   5 -
 5 files changed, 554 insertions(+), 46 deletions(-)
 create mode 100644 models/audit_log.go

diff --git a/accounts/userfetch.go b/accounts/userfetch.go
index 6b8bba4..b509086 100644
[message trimmed]
From Peter Sanchez to ~netlandish/links-discuss
Hi all,

I am happy to announce the release of links 0.1.4.

https://git.code.netlandish.com/~netlandish/links/refs/0.1.4

If anyone is running this and wants to remove the existing sanitization on the scraped URL metadata, you can compile and run the following script:

https://paste.sr.ht/~petersanchez/95f653a54e7ad896472e26950bd88446cda974e1

Just make a `cmd/cleanup` directory and place that file in there. Then compile and run it:
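The exact commands were trimmed from the archive; the usual Go toolchain invocation would look something like the following (the binary name `cleanup` and the save path are my assumptions, not from the original message):

```shell
# Illustrative only: assumes you are in the links repository root and have
# saved the paste contents as cmd/cleanup/main.go.
mkdir -p cmd/cleanup
# (copy the paste into cmd/cleanup/main.go)
go build -o cleanup ./cmd/cleanup
./cleanup
```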
From Peter Sanchez to ~netlandish/links-dev
Hi all,

I am happy to announce the release of links 0.1.4.

https://git.code.netlandish.com/~netlandish/links/refs/0.1.4

If anyone is running this and wants to remove the existing sanitization on the scraped URL metadata, you can compile and run the following script:

https://paste.sr.ht/~petersanchez/95f653a54e7ad896472e26950bd88446cda974e1

Just make a `cmd/cleanup` directory and place that file in there. Then compile and run it:
From Peter Sanchez to ~netlandish/links-dev
Applied.

To git@git.code.netlandish.com:~netlandish/links
   b001fb8..f40c514  master -> master
From Peter Sanchez to ~netlandish/links-dev
Implements: https://todo.code.netlandish.com/~netlandish/links/93
---
You can use the following program to correct any existing entries, in case anyone actually has this running anywhere.

https://paste.sr.ht/~petersanchez/95f653a54e7ad896472e26950bd88446cda974e1

 core/routes.go           |  2 --
 helpers.go               | 12 ++++++------
 templates/feed.html      |  4 ++--
 templates/link_list.html |  6 +++---
 4 files changed, 11 insertions(+), 13 deletions(-)

diff --git a/core/routes.go b/core/routes.go
[message trimmed]
From Peter Sanchez to ~netlandish/links-discuss
Hi all,

I am happy to announce the release of links 0.1.3.

https://git.code.netlandish.com/~netlandish/links/refs/0.1.3

Release highlights:

# Added

- Specify Sendy list-id when integrating per situation

# Changed
From Peter Sanchez to ~netlandish/links-dev
Hi all,

I am happy to announce the release of links 0.1.3.

https://git.code.netlandish.com/~netlandish/links/refs/0.1.3

Release highlights:

# Added

- Specify Sendy list-id when integrating per situation

# Changed