Compare commits


37 Commits

Author SHA1 Message Date
Thomas Pelletier 643c251c4b Fix panic when unmarshaling into a map twice (#854)
Fixes #851
2023-02-28 17:34:24 +01:00
Thomas Pelletier 8a416daa69 Fix error report of type mismatch on inline tables (#853)
The parser did not track the location of the faulty inline table in the
document, and the unmarshaler tried to use the non-raw data field of the
AST node, both resulting in a panic when generating the parser error.

Fixes #850
2023-02-28 17:06:49 +01:00
Andreas Deininger fcd9179b7d Fixes typos (#849) 2023-02-13 03:57:48 -08:00
Marty 9f5726004e Allow integers to be unmarshaled into floats (#841)
Co-authored-by: Marty <martin@windscribe.com>
2023-02-09 12:02:25 -05:00
Thomas Pelletier c4a2eef8a4 Fix go 1.20 version in github actions (#848)
YAML interprets 1.20 as 1.2 unless it is explicitly quoted as a string.
2023-02-09 12:00:14 -05:00
Thomas Pelletier b58c20aa49 Upgrade go 1.20 (#847)
Fixes #842
2023-02-08 18:59:58 -05:00
Cuong Manh Le 090cccf4ba Fix inline table first key value whitespace (#837)
Co-authored-by: Cuong Manh Le <cuong@windscribe.com>
2023-02-01 12:00:09 +01:00
DavidKorczynski 58a592bbf8 ci: add CIFuzz integration (#831)
Signed-off-by: David Korczynski <david@adalogics.com>
2022-11-21 18:51:48 -05:00
dependabot[bot] 94bd3ddcd6 build(deps): bump actions/setup-go from 2 to 3 (#820)
Bumps [actions/setup-go](https://github.com/actions/setup-go) from 2 to 3.
- [Release notes](https://github.com/actions/setup-go/releases)
- [Commits](https://github.com/actions/setup-go/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/setup-go
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-11-09 16:14:19 -05:00
Thomas Pelletier e195b58fd0 Expose parser API as unstable (#827) 2022-11-09 16:12:39 -05:00
dependabot[bot] c83d001c6d build(deps): bump github.com/stretchr/testify from 1.8.0 to 1.8.1 (#825)
Bumps [github.com/stretchr/testify](https://github.com/stretchr/testify) from 1.8.0 to 1.8.1.
- [Release notes](https://github.com/stretchr/testify/releases)
- [Commits](https://github.com/stretchr/testify/compare/v1.8.0...v1.8.1)

---
updated-dependencies:
- dependency-name: github.com/stretchr/testify
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-10-30 13:53:41 -04:00
Johanan Idicula b9e3b9c370 refactor: Use typeMismatchError rather than raw string error (#826)
Uses the existing method to DRY up the error message generation, and decorates
with position index where needed. No behaviour is changed, but it allows for
further changes to make error messaging more specific.

Related to: pelletier/go-toml#806
2022-10-30 13:44:16 -04:00
Olivier Mengué d26887310c Reduce init time allocation when declaring types used for reflect (#821)
In declarations of types used for reflect, use
`reflect.TypeOf((*T)(nil)).Elem()` instead of `reflect.TypeOf(T{})` to avoid
init-time allocations.

See related stdlib issue: https://github.com/golang/go/issues/55973
2022-10-07 14:28:37 +02:00
Thomas Pelletier 942841787a Fix reflect.Pointer backward compatibility (#813)
Though we don't officially support older versions of Go, this is an easy fix to
unblock people.

Fixes #812
2022-08-26 09:15:03 -04:00
Thomas Pelletier 28f1efc7d3 Decode: don't break on non-struct embed field (#810) 2022-08-22 18:39:11 -04:00
Piotr Buliński 7d69e4a728 Add missing '+build' comment to fuzz_test.go (#809) 2022-08-22 14:05:37 -04:00
Thomas Pelletier e46d245c09 Decode: don't crash on embedded nil pointers (#808)
Also has the perk of reducing the overhead of FindByIndex:

```
name                                old time/op    new time/op    delta
UnmarshalDataset/config-32            17.0ms ± 1%    17.0ms ± 1%    ~     (p=1.000 n=5+5)
UnmarshalDataset/canada-32            71.6ms ± 1%    71.4ms ± 1%    ~     (p=1.000 n=5+5)
UnmarshalDataset/citm_catalog-32      24.2ms ± 3%    23.5ms ± 2%  -3.03%  (p=0.032 n=5+5)
UnmarshalDataset/twitter-32           9.37ms ± 1%    9.09ms ± 2%  -2.97%  (p=0.032 n=5+5)
UnmarshalDataset/code-32              75.4ms ± 2%    74.9ms ± 0%    ~     (p=0.222 n=5+5)
UnmarshalDataset/example-32            147µs ±10%     136µs ± 1%  -7.14%  (p=0.008 n=5+5)
Unmarshal/SimpleDocument/struct-32     512ns ± 2%     500ns ± 0%  -2.35%  (p=0.008 n=5+5)
Unmarshal/SimpleDocument/map-32        721ns ± 2%     702ns ± 1%  -2.68%  (p=0.008 n=5+5)
Unmarshal/ReferenceFile/struct-32     40.1µs ± 0%    39.6µs ± 0%  -1.30%  (p=0.008 n=5+5)
Unmarshal/ReferenceFile/map-32        62.3µs ± 1%    60.6µs ± 0%  -2.83%  (p=0.008 n=5+5)
Unmarshal/HugoFrontMatter-32          10.8µs ± 1%    10.5µs ± 1%  -2.86%  (p=0.008 n=5+5)

name                                old speed      new speed      delta
UnmarshalDataset/config-32          61.8MB/s ± 1%  61.8MB/s ± 1%    ~     (p=1.000 n=5+5)
UnmarshalDataset/canada-32          30.8MB/s ± 1%  30.8MB/s ± 1%    ~     (p=1.000 n=5+5)
UnmarshalDataset/citm_catalog-32    23.0MB/s ± 3%  23.8MB/s ± 2%  +3.09%  (p=0.032 n=5+5)
UnmarshalDataset/twitter-32         47.2MB/s ± 1%  48.6MB/s ± 2%  +3.09%  (p=0.032 n=5+5)
UnmarshalDataset/code-32            35.6MB/s ± 2%  35.9MB/s ± 0%    ~     (p=0.222 n=5+5)
UnmarshalDataset/example-32         55.3MB/s ±10%  59.4MB/s ± 1%  +7.36%  (p=0.008 n=5+5)
Unmarshal/SimpleDocument/struct-32  21.5MB/s ± 2%  22.0MB/s ± 0%  +2.41%  (p=0.008 n=5+5)
Unmarshal/SimpleDocument/map-32     15.2MB/s ± 2%  15.7MB/s ± 1%  +2.74%  (p=0.008 n=5+5)
Unmarshal/ReferenceFile/struct-32    131MB/s ± 0%   132MB/s ± 0%  +1.31%  (p=0.008 n=5+5)
Unmarshal/ReferenceFile/map-32      84.1MB/s ± 1%  86.6MB/s ± 0%  +2.91%  (p=0.008 n=5+5)
Unmarshal/HugoFrontMatter-32        50.6MB/s ± 1%  52.1MB/s ± 1%  +2.93%  (p=0.008 n=5+5)

name                                old alloc/op   new alloc/op   delta
UnmarshalDataset/config-32            5.86MB ± 0%    5.86MB ± 0%    ~     (p=0.579 n=5+5)
UnmarshalDataset/canada-32            83.0MB ± 0%    83.0MB ± 0%    ~     (p=0.651 n=5+5)
UnmarshalDataset/citm_catalog-32      34.7MB ± 0%    34.7MB ± 0%    ~     (p=0.548 n=5+5)
UnmarshalDataset/twitter-32           12.7MB ± 0%    12.7MB ± 0%    ~     (p=1.000 n=5+5)
UnmarshalDataset/code-32              22.2MB ± 0%    22.2MB ± 0%    ~     (p=0.841 n=5+5)
UnmarshalDataset/example-32            186kB ± 0%     186kB ± 0%    ~     (p=0.111 n=5+5)
Unmarshal/SimpleDocument/struct-32      805B ± 0%      805B ± 0%    ~     (all equal)
Unmarshal/SimpleDocument/map-32       1.13kB ± 0%    1.13kB ± 0%    ~     (all equal)
Unmarshal/ReferenceFile/struct-32     20.9kB ± 0%    20.9kB ± 0%    ~     (p=0.643 n=5+5)
Unmarshal/ReferenceFile/map-32        38.3kB ± 0%    38.3kB ± 0%    ~     (p=0.397 n=5+5)
Unmarshal/HugoFrontMatter-32          7.44kB ± 0%    7.44kB ± 0%    ~     (all equal)

name                                old allocs/op  new allocs/op  delta
UnmarshalDataset/config-32              227k ± 0%      227k ± 0%    ~     (p=1.000 n=5+5)
UnmarshalDataset/canada-32              782k ± 0%      782k ± 0%    ~     (all equal)
UnmarshalDataset/citm_catalog-32        192k ± 0%      192k ± 0%    ~     (p=0.968 n=4+5)
UnmarshalDataset/twitter-32            56.9k ± 0%     56.9k ± 0%    ~     (p=0.429 n=4+5)
UnmarshalDataset/code-32               1.05M ± 0%     1.05M ± 0%    ~     (p=0.556 n=4+5)
UnmarshalDataset/example-32            1.36k ± 0%     1.36k ± 0%    ~     (all equal)
Unmarshal/SimpleDocument/struct-32      9.00 ± 0%      9.00 ± 0%    ~     (all equal)
Unmarshal/SimpleDocument/map-32         13.0 ± 0%      13.0 ± 0%    ~     (all equal)
Unmarshal/ReferenceFile/struct-32        183 ± 0%       183 ± 0%    ~     (all equal)
Unmarshal/ReferenceFile/map-32           642 ± 0%       642 ± 0%    ~     (all equal)
Unmarshal/HugoFrontMatter-32             141 ± 0%       141 ± 0%    ~     (all equal)
```

Fixes #807
2022-08-20 21:24:03 -04:00
Thomas Pelletier 7baa23f493 Decode: error on array table mismatched type (#804)
Prevent the decoder from continuing if it encounters a type it cannot decode an
array table into.

Fixes #799
2022-08-15 16:38:07 -04:00
Thomas Pelletier 2d8433b69e Encode: don't inherit omitempty (#803)
Fixes #786.
2022-08-15 11:29:46 -04:00
Thomas Pelletier 67bc5422f3 Go 1.19 (#802) 2022-08-15 10:56:33 -04:00
Thomas Pelletier fb6d1d6c2b Marshal: define and fix newlines behavior when using omitempty (#798)
Ref #786
2022-07-24 15:40:20 -04:00
dependabot[bot] d017a6dc89 build(deps): bump github.com/stretchr/testify from 1.7.5 to 1.8.0 (#795) 2022-06-29 09:51:28 -04:00
dependabot[bot] d6d3196163 build(deps): bump github.com/stretchr/testify from 1.7.4 to 1.7.5 (#794) 2022-06-24 12:49:56 -04:00
dependabot[bot] 41718a6db3 build(deps): bump github.com/stretchr/testify from 1.7.2 to 1.7.4 (#793)
Bumps [github.com/stretchr/testify](https://github.com/stretchr/testify) from 1.7.2 to 1.7.4.
- [Release notes](https://github.com/stretchr/testify/releases)
- [Commits](https://github.com/stretchr/testify/compare/v1.7.2...v1.7.4)

---
updated-dependencies:
- dependency-name: github.com/stretchr/testify
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-06-21 08:32:13 -04:00
Thomas Pelletier 216628222f Build arm + arm64 binaries for linux and windows (#790)
* Build arm + arm64 binaries for linux and windows

* Type MaxInt64 to avoid overflow on 32 bits arch

On a 32-bit arch, math.MaxInt64 is interpreted as an int, and therefore
overflows. This caused compilation to fail on those platforms.
2022-06-08 18:05:42 -04:00
dependabot[bot] 322e0b15d2 build(deps): bump github.com/stretchr/testify from 1.7.1 to 1.7.2 (#788)
Bumps [github.com/stretchr/testify](https://github.com/stretchr/testify) from 1.7.1 to 1.7.2.
- [Release notes](https://github.com/stretchr/testify/releases)
- [Commits](https://github.com/stretchr/testify/compare/v1.7.1...v1.7.2)

---
updated-dependencies:
- dependency-name: github.com/stretchr/testify
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-06-08 17:44:59 -04:00
Thomas Pelletier 85bfc0ed51 Encode: add bound check for uint64 > math.MaxInt64 (#785)
As brought up on #782, there is an asymmetry between the numbers go-toml can
encode and decode. Specifically, unsigned numbers larger than `math.MaxInt64`
could be encoded but not decoded.

We considered two options: allow decoding of those numbers, or prevent those
numbers from being encoded.

Ultimately we decided to narrow the range of numbers to be encoded. The TOML
specification only requires parsers to support at least the 64-bit integer
range. Allowing larger numbers would create non-standard TOML documents, which
may not be readable (at best) by other implementations. It is safer to generate
documents that are valid by default. People who wish to deal with larger numbers
can provide a custom type implementing `encoding.TextMarshaler`.

Refs #781
2022-05-31 22:10:41 -04:00
dependabot[bot] 295a720dfb build(deps): bump goreleaser/goreleaser-action from 2 to 3 (#783)
Bumps [goreleaser/goreleaser-action](https://github.com/goreleaser/goreleaser-action) from 2 to 3.
- [Release notes](https://github.com/goreleaser/goreleaser-action/releases)
- [Commits](https://github.com/goreleaser/goreleaser-action/compare/v2...v3)

---
updated-dependencies:
- dependency-name: goreleaser/goreleaser-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-05-23 15:05:00 +02:00
Thomas Pelletier 0a422e3dbd Decoder: check max uint on 32 bit platforms (#778)
Fixes #777
2022-05-10 15:43:26 +02:00
Thomas Pelletier 627dade0c7 Encode: support comment on array tables (#776)
Fixes #774
2022-05-10 15:17:36 +02:00
Thomas Pelletier b2e0231cc9 Encode: fix multiline comment (#775)
Fixes #768
2022-05-10 14:53:26 +02:00
dependabot[bot] ba95863cd3 build(deps): bump docker/login-action from 1 to 2 (#771)
Bumps [docker/login-action](https://github.com/docker/login-action) from 1 to 2.
- [Release notes](https://github.com/docker/login-action/releases)
- [Commits](https://github.com/docker/login-action/compare/v1...v2)

---
updated-dependencies:
- dependency-name: docker/login-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-05-09 18:59:02 +02:00
Erik Hartwig db679df765 Typo in README.md fix (#770)
Change "document was not prevent in the target structure" to "document was not present in the target structure".
2022-05-09 18:46:46 +02:00
Thomas Pelletier c5ca2c682b Fix embedded struct with explicit field name (#773)
Fixes #772
2022-05-09 18:45:02 +02:00
Thomas Pelletier ed80712cb4 Copy version policy from v1 2022-04-28 11:55:39 -04:00
dependabot[bot] b24772942d build(deps): bump github/codeql-action from 1 to 2 (#764)
Bumps [github/codeql-action](https://github.com/github/codeql-action) from 1 to 2.
- [Release notes](https://github.com/github/codeql-action/releases)
- [Changelog](https://github.com/github/codeql-action/blob/main/CHANGELOG.md)
- [Commits](https://github.com/github/codeql-action/compare/v1...v2)

---
updated-dependencies:
- dependency-name: github/codeql-action
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-04-28 08:24:14 -04:00
dependabot[bot] 9501a05ed7 build(deps): bump actions/checkout from 2 to 3 (#765)
Bumps [actions/checkout](https://github.com/actions/checkout) from 2 to 3.
- [Release notes](https://github.com/actions/checkout/releases)
- [Changelog](https://github.com/actions/checkout/blob/main/CHANGELOG.md)
- [Commits](https://github.com/actions/checkout/compare/v2...v3)

---
updated-dependencies:
- dependency-name: actions/checkout
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2022-04-28 08:23:56 -04:00
43 changed files with 1339 additions and 752 deletions
+26
@@ -0,0 +1,26 @@
+name: CIFuzz
+on: [pull_request]
+jobs:
+  Fuzzing:
+    runs-on: ubuntu-latest
+    steps:
+    - name: Build Fuzzers
+      id: build
+      uses: google/oss-fuzz/infra/cifuzz/actions/build_fuzzers@master
+      with:
+        oss-fuzz-project-name: 'go-toml'
+        dry-run: false
+        language: go
+    - name: Run Fuzzers
+      uses: google/oss-fuzz/infra/cifuzz/actions/run_fuzzers@master
+      with:
+        oss-fuzz-project-name: 'go-toml'
+        fuzz-seconds: 300
+        dry-run: false
+        language: go
+    - name: Upload Crash
+      uses: actions/upload-artifact@v3
+      if: failure() && steps.build.outcome == 'success'
+      with:
+        name: artifacts
+        path: ./out/artifacts
+4 -4
@@ -35,11 +35,11 @@ jobs:
     steps:
     - name: Checkout repository
-      uses: actions/checkout@v2
+      uses: actions/checkout@v3

     # Initializes the CodeQL tools for scanning.
     - name: Initialize CodeQL
-      uses: github/codeql-action/init@v1
+      uses: github/codeql-action/init@v2
       with:
         languages: ${{ matrix.language }}
         # If you wish to specify custom queries, you can do so here or in a config file.
@@ -50,7 +50,7 @@ jobs:
     # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
     # If this step fails, then you should remove it and run the build manually (see below)
     - name: Autobuild
-      uses: github/codeql-action/autobuild@v1
+      uses: github/codeql-action/autobuild@v2

     # ️ Command-line programs to run using the OS shell.
     # 📚 https://git.io/JvXDl
@@ -64,4 +64,4 @@ jobs:
     #   make release

     - name: Perform CodeQL Analysis
-      uses: github/codeql-action/analyze@v1
+      uses: github/codeql-action/analyze@v2
+3 -3
@@ -9,12 +9,12 @@ jobs:
     runs-on: "ubuntu-latest"
     name: report
     steps:
-      - uses: actions/checkout@master
+      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Setup go
-        uses: actions/setup-go@master
+        uses: actions/setup-go@v3
        with:
-          go-version: 1.18
+          go-version: "1.20"
      - name: Run tests with coverage
        run: ./ci.sh coverage -d "${GITHUB_BASE_REF-HEAD}"
+5 -5
@@ -16,21 +16,21 @@ jobs:
     runs-on: ubuntu-latest
     steps:
      - name: Checkout
-        uses: actions/checkout@v2
+        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Set up Go
-        uses: actions/setup-go@v2
+        uses: actions/setup-go@v3
        with:
-          go-version: 1.18
+          go-version: "1.20"
      - name: Login to GitHub Container Registry
-        uses: docker/login-action@v1
+        uses: docker/login-action@v2
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - name: Run GoReleaser
-        uses: goreleaser/goreleaser-action@v2
+        uses: goreleaser/goreleaser-action@v3
        with:
          distribution: goreleaser
          version: latest
+3 -3
@@ -12,15 +12,15 @@ jobs:
     strategy:
       matrix:
         os: [ 'ubuntu-latest', 'windows-latest', 'macos-latest']
-        go: [ '1.17', '1.18' ]
+        go: [ '1.19', '1.20' ]
     runs-on: ${{ matrix.os }}
     name: ${{ matrix.go }}/${{ matrix.os }}
     steps:
-      - uses: actions/checkout@master
+      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Setup go ${{ matrix.go }}
-        uses: actions/setup-go@master
+        uses: actions/setup-go@v3
        with:
          go-version: ${{ matrix.go }}
      - name: Run unit tests
+12
@@ -16,7 +16,11 @@ builds:
   mod_timestamp: '{{ .CommitTimestamp }}'
   targets:
     - linux_amd64
+    - linux_arm64
+    - linux_arm
     - windows_amd64
+    - windows_arm64
+    - windows_arm
     - darwin_amd64
     - darwin_arm64
 - id: tomljson
@@ -31,7 +35,11 @@ builds:
   mod_timestamp: '{{ .CommitTimestamp }}'
   targets:
     - linux_amd64
+    - linux_arm64
+    - linux_arm
     - windows_amd64
+    - windows_arm64
+    - windows_arm
     - darwin_amd64
     - darwin_arm64
 - id: jsontoml
@@ -46,7 +54,11 @@ builds:
   mod_timestamp: '{{ .CommitTimestamp }}'
   targets:
     - linux_amd64
+    - linux_arm64
+    - linux_arm
     - windows_amd64
+    - windows_arm64
+    - windows_arm
     - darwin_amd64
     - darwin_arm64
 universal_binaries:
+19 -1
@@ -38,7 +38,7 @@ operations should not be shockingly slow. See [benchmarks](#benchmarks).
 ### Strict mode

 `Decoder` can be set to "strict mode", which makes it error when some parts of
-the TOML document was not prevent in the target structure. This is a great way
+the TOML document was not present in the target structure. This is a great way
 to check for typos. [See example in the documentation][strict].

 [strict]: https://pkg.go.dev/github.com/pelletier/go-toml/v2#example-Decoder.DisallowUnknownFields
@@ -140,6 +140,17 @@ fmt.Println(string(b))

 [marshal]: https://pkg.go.dev/github.com/pelletier/go-toml/v2#Marshal

+## Unstable API
+
+This API does not yet follow the backward compatibility guarantees of this
+library. They provide early access to features that may have rough edges or an
+API subject to change.
+
+### Parser
+
+Parser is the unstable API that allows iterative parsing of a TOML document at
+the AST level. See https://pkg.go.dev/github.com/pelletier/go-toml/v2/unstable.
+
 ## Benchmarks

 Execution time speedup compared to other Go TOML libraries:
@@ -540,6 +551,13 @@ complete solutions exist out there.
 [query]: https://github.com/pelletier/go-toml/tree/f99d6bbca119636aeafcf351ee52b3d202782627/query
 [dasel]: https://github.com/TomWright/dasel

+## Versioning
+
+Go-toml follows [Semantic Versioning](https://semver.org). The supported version
+of [TOML](https://github.com/toml-lang/toml) is indicated at the beginning of
+this document. The last two major versions of Go are supported
+(see [Go Release Policy](https://golang.org/doc/devel/release.html#policy)).
+
 ## License

 The MIT License (MIT). Read [LICENSE](LICENSE).
+5 -5
@@ -1,20 +1,20 @@
 // Package jsontoml is a program that converts JSON to TOML.
 //
-// Usage
+// # Usage
 //
 // Reading from stdin:
 //
-//     cat file.json | jsontoml > file.toml
+//	cat file.json | jsontoml > file.toml
 //
 // Reading from a file:
 //
-//     jsontoml file.json > file.toml
+//	jsontoml file.json > file.toml
 //
-// Installation
+// # Installation
 //
 // Using Go:
 //
-//     go install github.com/pelletier/go-toml/v2/cmd/jsontoml@latest
+//	go install github.com/pelletier/go-toml/v2/cmd/jsontoml@latest
 package main

 import (
-1
@@ -26,7 +26,6 @@ func TestConvert(t *testing.T) {
 			}`,
 			expected: `[mytoml]
 a = 42.0
-
 `,
 		},
 		{
+5 -5
@@ -1,20 +1,20 @@
 // Package tomljson is a program that converts TOML to JSON.
 //
-// Usage
+// # Usage
 //
 // Reading from stdin:
 //
-//     cat file.toml | tomljson > file.json
+//	cat file.toml | tomljson > file.json
 //
 // Reading from a file:
 //
-//     tomljson file.toml > file.json
+//	tomljson file.toml > file.json
 //
-// Installation
+// # Installation
 //
 // Using Go:
 //
-//     go install github.com/pelletier/go-toml/v2/cmd/tomljson@latest
+//	go install github.com/pelletier/go-toml/v2/cmd/tomljson@latest
 package main

 import (
+5 -5
@@ -1,20 +1,20 @@
 // Package tomll is a linter program for TOML.
 //
-// Usage
+// # Usage
 //
 // Reading from stdin, writing to stdout:
 //
-//     cat file.toml | tomll
+//	cat file.toml | tomll
 //
 // Reading and updating a list of files in place:
 //
-//     tomll a.toml b.toml c.toml
+//	tomll a.toml b.toml c.toml
 //
-// Installation
+// # Installation
 //
 // Using Go:
 //
-//     go install github.com/pelletier/go-toml/v2/cmd/tomll@latest
+//	go install github.com/pelletier/go-toml/v2/cmd/tomll@latest
 package main

 import (
-1
@@ -23,7 +23,6 @@ mytoml.a = 42.0
 `,
 			expected: `[mytoml]
 a = 42.0
-
 `,
 		},
 		{
+1 -1
@@ -3,7 +3,7 @@
 //
 // Within the go-toml package, run `go generate`. Otherwise, use:
 //
-//     go run github.com/pelletier/go-toml/cmd/tomltestgen -o toml_testgen_test.go
+//	go run github.com/pelletier/go-toml/cmd/tomltestgen -o toml_testgen_test.go
 package main

 import (
+47 -41
@@ -5,6 +5,8 @@ import (
 	"math"
 	"strconv"
 	"time"
+
+	"github.com/pelletier/go-toml/v2/unstable"
 )

 func parseInteger(b []byte) (int64, error) {
@@ -32,7 +34,7 @@ func parseLocalDate(b []byte) (LocalDate, error) {
 	var date LocalDate

 	if len(b) != 10 || b[4] != '-' || b[7] != '-' {
-		return date, newDecodeError(b, "dates are expected to have the format YYYY-MM-DD")
+		return date, unstable.NewParserError(b, "dates are expected to have the format YYYY-MM-DD")
 	}

 	var err error
@@ -53,7 +55,7 @@ func parseLocalDate(b []byte) (LocalDate, error) {
 	}

 	if !isValidDate(date.Year, date.Month, date.Day) {
-		return LocalDate{}, newDecodeError(b, "impossible date")
+		return LocalDate{}, unstable.NewParserError(b, "impossible date")
 	}

 	return date, nil
@@ -64,7 +66,7 @@ func parseDecimalDigits(b []byte) (int, error) {
 	for i, c := range b {
 		if c < '0' || c > '9' {
-			return 0, newDecodeError(b[i:i+1], "expected digit (0-9)")
+			return 0, unstable.NewParserError(b[i:i+1], "expected digit (0-9)")
 		}
 		v *= 10
 		v += int(c - '0')
@@ -97,7 +99,7 @@ func parseDateTime(b []byte) (time.Time, error) {
 	} else {
 		const dateTimeByteLen = 6
 		if len(b) != dateTimeByteLen {
-			return time.Time{}, newDecodeError(b, "invalid date-time timezone")
+			return time.Time{}, unstable.NewParserError(b, "invalid date-time timezone")
 		}
 		var direction int
 		switch b[0] {
@@ -106,11 +108,11 @@ func parseDateTime(b []byte) (time.Time, error) {
 		case '+':
 			direction = +1
 		default:
-			return time.Time{}, newDecodeError(b[:1], "invalid timezone offset character")
+			return time.Time{}, unstable.NewParserError(b[:1], "invalid timezone offset character")
 		}

 		if b[3] != ':' {
-			return time.Time{}, newDecodeError(b[3:4], "expected a : separator")
+			return time.Time{}, unstable.NewParserError(b[3:4], "expected a : separator")
 		}

 		hours, err := parseDecimalDigits(b[1:3])
@@ -118,7 +120,7 @@ func parseDateTime(b []byte) (time.Time, error) {
 			return time.Time{}, err
 		}
 		if hours > 23 {
-			return time.Time{}, newDecodeError(b[:1], "invalid timezone offset hours")
+			return time.Time{}, unstable.NewParserError(b[:1], "invalid timezone offset hours")
 		}

 		minutes, err := parseDecimalDigits(b[4:6])
@@ -126,7 +128,7 @@ func parseDateTime(b []byte) (time.Time, error) {
 			return time.Time{}, err
 		}
 		if minutes > 59 {
-			return time.Time{}, newDecodeError(b[:1], "invalid timezone offset minutes")
+			return time.Time{}, unstable.NewParserError(b[:1], "invalid timezone offset minutes")
 		}

 		seconds := direction * (hours*3600 + minutes*60)
@@ -139,7 +141,7 @@ func parseDateTime(b []byte) (time.Time, error) {
 	}

 	if len(b) > 0 {
-		return time.Time{}, newDecodeError(b, "extra bytes at the end of the timezone")
+		return time.Time{}, unstable.NewParserError(b, "extra bytes at the end of the timezone")
 	}

 	t := time.Date(
@@ -160,7 +162,7 @@ func parseLocalDateTime(b []byte) (LocalDateTime, []byte, error) {
 	const localDateTimeByteMinLen = 11
 	if len(b) < localDateTimeByteMinLen {
-		return dt, nil, newDecodeError(b, "local datetimes are expected to have the format YYYY-MM-DDTHH:MM:SS[.NNNNNNNNN]")
+		return dt, nil, unstable.NewParserError(b, "local datetimes are expected to have the format YYYY-MM-DDTHH:MM:SS[.NNNNNNNNN]")
 	}

 	date, err := parseLocalDate(b[:10])
@@ -171,7 +173,7 @@ func parseLocalDateTime(b []byte) (LocalDateTime, []byte, error) {
 	sep := b[10]
 	if sep != 'T' && sep != ' ' && sep != 't' {
-		return dt, nil, newDecodeError(b[10:11], "datetime separator is expected to be T or a space")
+		return dt, nil, unstable.NewParserError(b[10:11], "datetime separator is expected to be T or a space")
 	}

 	t, rest, err := parseLocalTime(b[11:])
@@ -195,7 +197,7 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 	// check if b matches to have expected format HH:MM:SS[.NNNNNN]
 	const localTimeByteLen = 8
 	if len(b) < localTimeByteLen {
-		return t, nil, newDecodeError(b, "times are expected to have the format HH:MM:SS[.NNNNNN]")
+		return t, nil, unstable.NewParserError(b, "times are expected to have the format HH:MM:SS[.NNNNNN]")
 	}

 	var err error
@@ -206,10 +208,10 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 	}

 	if t.Hour > 23 {
-		return t, nil, newDecodeError(b[0:2], "hour cannot be greater 23")
+		return t, nil, unstable.NewParserError(b[0:2], "hour cannot be greater 23")
 	}
 	if b[2] != ':' {
-		return t, nil, newDecodeError(b[2:3], "expecting colon between hours and minutes")
+		return t, nil, unstable.NewParserError(b[2:3], "expecting colon between hours and minutes")
 	}

 	t.Minute, err = parseDecimalDigits(b[3:5])
@@ -217,10 +219,10 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 		return t, nil, err
 	}
 	if t.Minute > 59 {
-		return t, nil, newDecodeError(b[3:5], "minutes cannot be greater 59")
+		return t, nil, unstable.NewParserError(b[3:5], "minutes cannot be greater 59")
 	}
 	if b[5] != ':' {
-		return t, nil, newDecodeError(b[5:6], "expecting colon between minutes and seconds")
+		return t, nil, unstable.NewParserError(b[5:6], "expecting colon between minutes and seconds")
 	}

 	t.Second, err = parseDecimalDigits(b[6:8])
@@ -229,7 +231,7 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 	}

 	if t.Second > 60 {
-		return t, nil, newDecodeError(b[6:8], "seconds cannot be greater 60")
+		return t, nil, unstable.NewParserError(b[6:8], "seconds cannot be greater 60")
 	}

 	b = b[8:]
@@ -242,7 +244,7 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 		for i, c := range b[1:] {
 			if !isDigit(c) {
 				if i == 0 {
-					return t, nil, newDecodeError(b[0:1], "need at least one digit after fraction point")
+					return t, nil, unstable.NewParserError(b[0:1], "need at least one digit after fraction point")
 				}
 				break
 			}
@@ -266,7 +268,7 @@ func parseLocalTime(b []byte) (LocalTime, []byte, error) {
 		}

 		if precision == 0 {
-			return t, nil, newDecodeError(b[:1], "nanoseconds need at least one digit")
+			return t, nil, unstable.NewParserError(b[:1], "nanoseconds need at least one digit")
 		}

 		t.Nanosecond = frac * nspow[precision]
@@ -289,24 +291,24 @@ func parseFloat(b []byte) (float64, error) {
 	}

 	if cleaned[0] == '.' {
-		return 0, newDecodeError(b, "float cannot start with a dot")
+		return 0, unstable.NewParserError(b, "float cannot start with a dot")
 	}

 	if cleaned[len(cleaned)-1] == '.' {
-		return 0, newDecodeError(b, "float cannot end with a dot")
+		return 0, unstable.NewParserError(b, "float cannot end with a dot")
 	}

 	dotAlreadySeen := false
 	for i, c := range cleaned {
 		if c == '.' {
 			if dotAlreadySeen {
-				return 0, newDecodeError(b[i:i+1], "float can have at most one decimal point")
+				return 0, unstable.NewParserError(b[i:i+1], "float can have at most one decimal point")
 			}
 			if !isDigit(cleaned[i-1]) {
-				return 0, newDecodeError(b[i-1:i+1], "float decimal point must be preceded by a digit")
+				return 0, unstable.NewParserError(b[i-1:i+1], "float decimal point must be preceded by a digit")
 			}
 			if !isDigit(cleaned[i+1]) {
-				return 0, newDecodeError(b[i:i+2], "float decimal point must be followed by a digit")
+				return 0, unstable.NewParserError(b[i:i+2], "float decimal point must be followed by a digit")
 			}
 			dotAlreadySeen = true
 		}
@@ -317,12 +319,12 @@ func parseFloat(b []byte) (float64, error) {
 		start = 1
 	}
 	if cleaned[start] == '0' && isDigit(cleaned[start+1]) {
-		return 0, newDecodeError(b, "float integer part cannot have leading zeroes")
+		return 0, unstable.NewParserError(b, "float integer part cannot have leading zeroes")
} }
f, err := strconv.ParseFloat(string(cleaned), 64) f, err := strconv.ParseFloat(string(cleaned), 64)
if err != nil { if err != nil {
return 0, newDecodeError(b, "unable to parse float: %w", err) return 0, unstable.NewParserError(b, "unable to parse float: %w", err)
} }
return f, nil return f, nil
@@ -336,7 +338,7 @@ func parseIntHex(b []byte) (int64, error) {
i, err := strconv.ParseInt(string(cleaned), 16, 64) i, err := strconv.ParseInt(string(cleaned), 16, 64)
if err != nil { if err != nil {
return 0, newDecodeError(b, "couldn't parse hexadecimal number: %w", err) return 0, unstable.NewParserError(b, "couldn't parse hexadecimal number: %w", err)
} }
return i, nil return i, nil
@@ -350,7 +352,7 @@ func parseIntOct(b []byte) (int64, error) {
i, err := strconv.ParseInt(string(cleaned), 8, 64) i, err := strconv.ParseInt(string(cleaned), 8, 64)
if err != nil { if err != nil {
return 0, newDecodeError(b, "couldn't parse octal number: %w", err) return 0, unstable.NewParserError(b, "couldn't parse octal number: %w", err)
} }
return i, nil return i, nil
@@ -364,7 +366,7 @@ func parseIntBin(b []byte) (int64, error) {
i, err := strconv.ParseInt(string(cleaned), 2, 64) i, err := strconv.ParseInt(string(cleaned), 2, 64)
if err != nil { if err != nil {
return 0, newDecodeError(b, "couldn't parse binary number: %w", err) return 0, unstable.NewParserError(b, "couldn't parse binary number: %w", err)
} }
return i, nil return i, nil
@@ -387,12 +389,12 @@ func parseIntDec(b []byte) (int64, error) {
} }
if len(cleaned) > startIdx+1 && cleaned[startIdx] == '0' { if len(cleaned) > startIdx+1 && cleaned[startIdx] == '0' {
return 0, newDecodeError(b, "leading zero not allowed on decimal number") return 0, unstable.NewParserError(b, "leading zero not allowed on decimal number")
} }
i, err := strconv.ParseInt(string(cleaned), 10, 64) i, err := strconv.ParseInt(string(cleaned), 10, 64)
if err != nil { if err != nil {
return 0, newDecodeError(b, "couldn't parse decimal number: %w", err) return 0, unstable.NewParserError(b, "couldn't parse decimal number: %w", err)
} }
return i, nil return i, nil
@@ -409,11 +411,11 @@ func checkAndRemoveUnderscoresIntegers(b []byte) ([]byte, error) {
} }
if b[start] == '_' { if b[start] == '_' {
return nil, newDecodeError(b[start:start+1], "number cannot start with underscore") return nil, unstable.NewParserError(b[start:start+1], "number cannot start with underscore")
} }
if b[len(b)-1] == '_' { if b[len(b)-1] == '_' {
return nil, newDecodeError(b[len(b)-1:], "number cannot end with underscore") return nil, unstable.NewParserError(b[len(b)-1:], "number cannot end with underscore")
} }
// fast path // fast path
@@ -435,7 +437,7 @@ func checkAndRemoveUnderscoresIntegers(b []byte) ([]byte, error) {
c := b[i] c := b[i]
if c == '_' { if c == '_' {
if !before { if !before {
return nil, newDecodeError(b[i-1:i+1], "number must have at least one digit between underscores") return nil, unstable.NewParserError(b[i-1:i+1], "number must have at least one digit between underscores")
} }
before = false before = false
} else { } else {
@@ -449,11 +451,11 @@ func checkAndRemoveUnderscoresIntegers(b []byte) ([]byte, error) {
func checkAndRemoveUnderscoresFloats(b []byte) ([]byte, error) { func checkAndRemoveUnderscoresFloats(b []byte) ([]byte, error) {
if b[0] == '_' { if b[0] == '_' {
return nil, newDecodeError(b[0:1], "number cannot start with underscore") return nil, unstable.NewParserError(b[0:1], "number cannot start with underscore")
} }
if b[len(b)-1] == '_' { if b[len(b)-1] == '_' {
return nil, newDecodeError(b[len(b)-1:], "number cannot end with underscore") return nil, unstable.NewParserError(b[len(b)-1:], "number cannot end with underscore")
} }
// fast path // fast path
@@ -476,10 +478,10 @@ func checkAndRemoveUnderscoresFloats(b []byte) ([]byte, error) {
switch c { switch c {
case '_': case '_':
if !before { if !before {
return nil, newDecodeError(b[i-1:i+1], "number must have at least one digit between underscores") return nil, unstable.NewParserError(b[i-1:i+1], "number must have at least one digit between underscores")
} }
if i < len(b)-1 && (b[i+1] == 'e' || b[i+1] == 'E') { if i < len(b)-1 && (b[i+1] == 'e' || b[i+1] == 'E') {
return nil, newDecodeError(b[i+1:i+2], "cannot have underscore before exponent") return nil, unstable.NewParserError(b[i+1:i+2], "cannot have underscore before exponent")
} }
before = false before = false
case '+', '-': case '+', '-':
@@ -488,15 +490,15 @@ func checkAndRemoveUnderscoresFloats(b []byte) ([]byte, error) {
before = false before = false
case 'e', 'E': case 'e', 'E':
if i < len(b)-1 && b[i+1] == '_' { if i < len(b)-1 && b[i+1] == '_' {
return nil, newDecodeError(b[i+1:i+2], "cannot have underscore after exponent") return nil, unstable.NewParserError(b[i+1:i+2], "cannot have underscore after exponent")
} }
cleaned = append(cleaned, c) cleaned = append(cleaned, c)
case '.': case '.':
if i < len(b)-1 && b[i+1] == '_' { if i < len(b)-1 && b[i+1] == '_' {
return nil, newDecodeError(b[i+1:i+2], "cannot have underscore after decimal point") return nil, unstable.NewParserError(b[i+1:i+2], "cannot have underscore after decimal point")
} }
if i > 0 && b[i-1] == '_' { if i > 0 && b[i-1] == '_' {
return nil, newDecodeError(b[i-1:i], "cannot have underscore before decimal point") return nil, unstable.NewParserError(b[i-1:i], "cannot have underscore before decimal point")
} }
cleaned = append(cleaned, c) cleaned = append(cleaned, c)
default: default:
@@ -542,3 +544,7 @@ func daysIn(m int, year int) int {
func isLeap(year int) bool { func isLeap(year int) bool {
return year%4 == 0 && (year%100 != 0 || year%400 == 0) return year%4 == 0 && (year%100 != 0 || year%400 == 0)
} }
func isDigit(r byte) bool {
return r >= '0' && r <= '9'
}
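The clock bounds enforced above (hours up to 23, minutes up to 59, seconds up to 60 to allow leap seconds) can be exercised in isolation. This is a minimal sketch with hypothetical helper names (`parseTwoDigits`, `checkClock` are illustrative, not the library's API):

```go
package main

import (
	"errors"
	"fmt"
)

// parseTwoDigits converts exactly two ASCII digits to an int.
func parseTwoDigits(b []byte) (int, error) {
	if len(b) != 2 || b[0] < '0' || b[0] > '9' || b[1] < '0' || b[1] > '9' {
		return 0, errors.New("expected two digits")
	}
	return int(b[0]-'0')*10 + int(b[1]-'0'), nil
}

// checkClock validates an HH:MM:SS prefix with the same bounds as the
// diff above: hour <= 23, minute <= 59, second <= 60 (leap second).
func checkClock(b []byte) error {
	if len(b) < 8 || b[2] != ':' || b[5] != ':' {
		return errors.New("times are expected to have the format HH:MM:SS")
	}
	h, err := parseTwoDigits(b[0:2])
	if err != nil || h > 23 {
		return errors.New("hour cannot be greater 23")
	}
	m, err := parseTwoDigits(b[3:5])
	if err != nil || m > 59 {
		return errors.New("minutes cannot be greater 59")
	}
	s, err := parseTwoDigits(b[6:8])
	if err != nil || s > 60 {
		return errors.New("seconds cannot be greater 60")
	}
	return nil
}

func main() {
	fmt.Println(checkClock([]byte("23:59:60")) == nil) // true: leap second allowed
	fmt.Println(checkClock([]byte("24:00:00")) == nil) // false: hour out of range
}
```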
+8 -25
@@ -6,6 +6,7 @@ import (
 "strings"
 "github.com/pelletier/go-toml/v2/internal/danger"
+"github.com/pelletier/go-toml/v2/unstable"
 )
 // DecodeError represents an error encountered during the parsing or decoding
@@ -55,25 +56,6 @@ func (s *StrictMissingError) String() string {
 type Key []string
-// internal version of DecodeError that is used as the base to create a
-// DecodeError with full context.
-type decodeError struct {
-highlight []byte
-message string
-key Key // optional
-}
-func (de *decodeError) Error() string {
-return de.message
-}
-func newDecodeError(highlight []byte, format string, args ...interface{}) error {
-return &decodeError{
-highlight: highlight,
-message: fmt.Errorf(format, args...).Error(),
-}
-}
 // Error returns the error message contained in the DecodeError.
 func (e *DecodeError) Error() string {
 return "toml: " + e.message
@@ -103,13 +85,14 @@ func (e *DecodeError) Key() Key {
 //
 // The function copies all bytes used in DecodeError, so that document and
 // highlight can be freely deallocated.
+//
 //nolint:funlen
-func wrapDecodeError(document []byte, de *decodeError) *DecodeError {
+func wrapDecodeError(document []byte, de *unstable.ParserError) *DecodeError {
-offset := danger.SubsliceOffset(document, de.highlight)
+offset := danger.SubsliceOffset(document, de.Highlight)
 errMessage := de.Error()
 errLine, errColumn := positionAtEnd(document[:offset])
-before, after := linesOfContext(document, de.highlight, offset, 3)
+before, after := linesOfContext(document, de.Highlight, offset, 3)
 var buf strings.Builder
@@ -139,7 +122,7 @@ func wrapDecodeError(document []byte, de *decodeError) *DecodeError {
 buf.Write(before[0])
 }
-buf.Write(de.highlight)
+buf.Write(de.Highlight)
 if len(after) > 0 {
 buf.Write(after[0])
@@ -157,7 +140,7 @@ func wrapDecodeError(document []byte, de *decodeError) *DecodeError {
 buf.WriteString(strings.Repeat(" ", len(before[0])))
 }
-buf.WriteString(strings.Repeat("~", len(de.highlight)))
+buf.WriteString(strings.Repeat("~", len(de.Highlight)))
 if len(errMessage) > 0 {
 buf.WriteString(" ")
@@ -182,7 +165,7 @@ func wrapDecodeError(document []byte, de *decodeError) *DecodeError {
 message: errMessage,
 line: errLine,
 column: errColumn,
-key: de.key,
+key: de.Key,
 human: buf.String(),
 }
 }
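The now-exported `Highlight` field is what drives the `~~~` marker drawn under the offending bytes in the human-readable error output. A rough standalone sketch of that rendering idea (single line only, hypothetical `underline` helper, not the actual `wrapDecodeError`):

```go
package main

import (
	"fmt"
	"strings"
)

// underline renders a line of input with a "~" marker under the
// highlighted span, followed by the error message, roughly in the
// style of DecodeError's human-readable output.
func underline(line string, start, length int, msg string) string {
	var buf strings.Builder
	buf.WriteString(line)
	buf.WriteString("\n")
	buf.WriteString(strings.Repeat(" ", start)) // pad up to the highlight
	buf.WriteString(strings.Repeat("~", length))
	if msg != "" {
		buf.WriteString(" " + msg)
	}
	return buf.String()
}

func main() {
	fmt.Println(underline("a = 1._5", 6, 2, "number must have at least one digit between underscores"))
}
```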
+4 -3
@@ -7,6 +7,7 @@ import (
 "strings"
 "testing"
+"github.com/pelletier/go-toml/v2/unstable"
 "github.com/stretchr/testify/assert"
 )
@@ -170,9 +171,9 @@ line 5`,
 doc := b.Bytes()
 hl := doc[start:end]
-err := wrapDecodeError(doc, &decodeError{
-highlight: hl,
-message: e.msg,
+err := wrapDecodeError(doc, &unstable.ParserError{
+Highlight: hl,
+Message: e.msg,
 })
 var derr *DecodeError
+8 -1
@@ -7,13 +7,20 @@ import (
 "github.com/stretchr/testify/require"
 )
-func TestFastSimple(t *testing.T) {
+func TestFastSimpleInt(t *testing.T) {
 m := map[string]int64{}
 err := toml.Unmarshal([]byte(`a = 42`), &m)
 require.NoError(t, err)
 require.Equal(t, map[string]int64{"a": 42}, m)
 }
+func TestFastSimpleFloat(t *testing.T) {
+m := map[string]float64{}
+err := toml.Unmarshal([]byte("a = 42\nb = 1.1\nc = 12341234123412341234123412341234"), &m)
+require.NoError(t, err)
+require.Equal(t, map[string]float64{"a": 42, "b": 1.1, "c": 1.2341234123412342e+31}, m)
+}
 func TestFastSimpleString(t *testing.T) {
 m := map[string]string{}
 err := toml.Unmarshal([]byte(`a = "hello"`), &m)
+2 -2
@@ -1,5 +1,5 @@
-//go:build go1.18
-// +build go1.18
+//go:build go1.18 || go1.19 || go1.20
+// +build go1.18 go1.19 go1.20
 package toml_test
+1 -1
@@ -2,4 +2,4 @@ module github.com/pelletier/go-toml/v2
 go 1.16
-require github.com/stretchr/testify v1.7.1
+require github.com/stretchr/testify v1.8.1
+9 -3
@@ -1,11 +1,17 @@
-github.com/davecgh/go-spew v1.1.0 h1:ZDRjVQ15GmhC3fiQ8ni8+OwkZQO4DARzQgrnXU1Liz8=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
+github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
 github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
 github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
 github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
-github.com/stretchr/testify v1.7.1 h1:5TQK59W5E3v0r2duFAb7P95B6hEeOyEnHRa8MjYSMTY=
+github.com/stretchr/objx v0.4.0/go.mod h1:YvHI0jy2hoMjB+UWwv71VJQ9isScKT/TqJzVSSt89Yw=
+github.com/stretchr/objx v0.5.0/go.mod h1:Yh+to48EsGEfYuaHDzXPcE3xhTkx73EhmCGUpEOglKo=
 github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/h/Wwjteg=
+github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
+github.com/stretchr/testify v1.8.1 h1:w7B6lhMri9wdJUVmEZPGGhZzrYTPvgJArz7wNPgYKsk=
+github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
 gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
 gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
-gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c h1:dUUwHk2QECo/6vqA44rthZ8ie2QXMNeKRTHCNY2nXvo=
 gopkg.in/yaml.v3 v3.0.0-20200313102051-9f266ea9e77c/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
+gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
+gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
-51
@@ -1,51 +0,0 @@
-package ast
-type Reference int
-const InvalidReference Reference = -1
-func (r Reference) Valid() bool {
-return r != InvalidReference
-}
-type Builder struct {
-tree Root
-lastIdx int
-}
-func (b *Builder) Tree() *Root {
-return &b.tree
-}
-func (b *Builder) NodeAt(ref Reference) *Node {
-return b.tree.at(ref)
-}
-func (b *Builder) Reset() {
-b.tree.nodes = b.tree.nodes[:0]
-b.lastIdx = 0
-}
-func (b *Builder) Push(n Node) Reference {
-b.lastIdx = len(b.tree.nodes)
-b.tree.nodes = append(b.tree.nodes, n)
-return Reference(b.lastIdx)
-}
-func (b *Builder) PushAndChain(n Node) Reference {
-newIdx := len(b.tree.nodes)
-b.tree.nodes = append(b.tree.nodes, n)
-if b.lastIdx >= 0 {
-b.tree.nodes[b.lastIdx].next = newIdx - b.lastIdx
-}
-b.lastIdx = newIdx
-return Reference(b.lastIdx)
-}
-func (b *Builder) AttachChild(parent Reference, child Reference) {
-b.tree.nodes[parent].child = int(child) - int(parent)
-}
-func (b *Builder) Chain(from Reference, to Reference) {
-b.tree.nodes[from].next = int(to) - int(from)
-}
+42
@@ -0,0 +1,42 @@
+package characters
+var invalidAsciiTable = [256]bool{
+0x00: true,
+0x01: true,
+0x02: true,
+0x03: true,
+0x04: true,
+0x05: true,
+0x06: true,
+0x07: true,
+0x08: true,
+// 0x09 TAB
+// 0x0A LF
+0x0B: true,
+0x0C: true,
+// 0x0D CR
+0x0E: true,
+0x0F: true,
+0x10: true,
+0x11: true,
+0x12: true,
+0x13: true,
+0x14: true,
+0x15: true,
+0x16: true,
+0x17: true,
+0x18: true,
+0x19: true,
+0x1A: true,
+0x1B: true,
+0x1C: true,
+0x1D: true,
+0x1E: true,
+0x1F: true,
+// 0x20 - 0x7E Printable ASCII characters
+0x7F: true,
+}
+func InvalidAscii(b byte) bool {
+return invalidAsciiTable[b]
+}
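The table above rejects the C0 control bytes except TAB, LF, and CR, plus DEL. The same predicate can be written without a lookup table; a sketch for illustration (the library keeps the table for speed):

```go
package main

import "fmt"

// invalidASCII reports whether b is a control byte that TOML forbids in
// strings: everything below 0x20 except TAB (0x09), LF (0x0A), and
// CR (0x0D), plus DEL (0x7F). This mirrors invalidAsciiTable above.
func invalidASCII(b byte) bool {
	if b == 0x7F {
		return true
	}
	return b < 0x20 && b != '\t' && b != '\n' && b != '\r'
}

func main() {
	fmt.Println(invalidASCII(0x00), invalidASCII('\t'), invalidASCII(0x7F)) // true false true
}
```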
+6 -47
@@ -1,4 +1,4 @@
-package toml
+package characters
 import (
 "unicode/utf8"
@@ -32,7 +32,7 @@ func (u utf8Err) Zero() bool {
 // 0x9 => tab, ok
 // 0xA - 0x1F => invalid
 // 0x7F => invalid
-func utf8TomlValidAlreadyEscaped(p []byte) (err utf8Err) {
+func Utf8TomlValidAlreadyEscaped(p []byte) (err utf8Err) {
 // Fast path. Check for and skip 8 bytes of ASCII characters per iteration.
 offset := 0
 for len(p) >= 8 {
@@ -48,7 +48,7 @@ func utf8TomlValidAlreadyEscaped(p []byte) (err utf8Err) {
 }
 for i, b := range p[:8] {
-if invalidAscii(b) {
+if InvalidAscii(b) {
 err.Index = offset + i
 err.Size = 1
 return
@@ -62,7 +62,7 @@ func utf8TomlValidAlreadyEscaped(p []byte) (err utf8Err) {
 for i := 0; i < n; {
 pi := p[i]
 if pi < utf8.RuneSelf {
-if invalidAscii(pi) {
+if InvalidAscii(pi) {
 err.Index = offset + i
 err.Size = 1
 return
@@ -106,11 +106,11 @@ func utf8TomlValidAlreadyEscaped(p []byte) (err utf8Err) {
 }
 // Return the size of the next rune if valid, 0 otherwise.
-func utf8ValidNext(p []byte) int {
+func Utf8ValidNext(p []byte) int {
 c := p[0]
 if c < utf8.RuneSelf {
-if invalidAscii(c) {
+if InvalidAscii(c) {
 return 0
 }
 return 1
@@ -140,47 +140,6 @@ func utf8ValidNext(p []byte) int {
 return size
 }
-var invalidAsciiTable = [256]bool{
-0x00: true,
-0x01: true,
-0x02: true,
-0x03: true,
-0x04: true,
-0x05: true,
-0x06: true,
-0x07: true,
-0x08: true,
-// 0x09 TAB
-// 0x0A LF
-0x0B: true,
-0x0C: true,
-// 0x0D CR
-0x0E: true,
-0x0F: true,
-0x10: true,
-0x11: true,
-0x12: true,
-0x13: true,
-0x14: true,
-0x15: true,
-0x16: true,
-0x17: true,
-0x18: true,
-0x19: true,
-0x1A: true,
-0x1B: true,
-0x1C: true,
-0x1D: true,
-0x1E: true,
-0x1F: true,
-// 0x20 - 0x7E Printable ASCII characters
-0x7F: true,
-}
-func invalidAscii(b byte) bool {
-return invalidAsciiTable[b]
-}
 // acceptRange gives the range of valid values for the second byte in a UTF-8
 // sequence.
 type acceptRange struct {
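For well-formed input, the renamed `Utf8ValidNext` is a hand-rolled fast path over what the standard `unicode/utf8` decoder provides. A rough, slower equivalent using `utf8.DecodeRune` plus the control-character filter (illustrative only; edge-case behavior of the real function may differ):

```go
package main

import (
	"fmt"
	"unicode/utf8"
)

// validNextSize returns the byte length of the next rune if it is valid
// UTF-8 and not a TOML-forbidden control character, 0 otherwise.
func validNextSize(p []byte) int {
	r, size := utf8.DecodeRune(p)
	if r == utf8.RuneError && size <= 1 {
		return 0 // invalid or truncated encoding (or empty input)
	}
	if r == 0x7F || (r < 0x20 && r != '\t' && r != '\n' && r != '\r') {
		return 0 // forbidden control character
	}
	return size
}

func main() {
	fmt.Println(validNextSize([]byte("é")))        // 2
	fmt.Println(validNextSize([]byte{0x01}))       // 0
	fmt.Println(validNextSize([]byte{0xFF, 0x00})) // 0
}
```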
+1 -1
@@ -18,7 +18,7 @@ type Program struct {
 Usage string
 Fn ConvertFn
 // Inplace allows the command to take more than one file as argument and
-// perform convertion in place on each provided file.
+// perform conversion in place on each provided file.
 Inplace bool
 }
@@ -67,6 +67,7 @@ func TestDocMarshal(t *testing.T) {
 }
 marshalTestToml := `title = 'TOML Marshal Testing'
 [basic_lists]
 floats = [12.3, 45.6, 78.9]
 bools = [true, false, true]
@@ -89,7 +90,6 @@ name = 'Second'
 [subdoc.first]
 name = 'First'
 [basic]
 uint = 5001
 bool = true
@@ -101,9 +101,9 @@ date = 1979-05-27T07:32:00Z
 [[subdoclist]]
 name = 'List.First'
 [[subdoclist]]
 name = 'List.Second'
 `
 result, err := toml.Marshal(docData)
@@ -117,14 +117,15 @@ func TestBasicMarshalQuotedKey(t *testing.T) {
 expected := `'Z.string-àéù' = 'Hello'
 'Yfloat-𝟘' = 3.5
 ['Xsubdoc-àéù']
 String2 = 'One'
 [['W.sublist-𝟘']]
 String2 = 'Two'
 [['W.sublist-𝟘']]
 String2 = 'Three'
 `
 require.Equal(t, string(expected), string(result))
@@ -159,8 +160,8 @@ bool = false
 int = 0
 string = ''
 stringlist = []
-[map]
+[map]
 `
 require.Equal(t, string(expected), string(result))
@@ -151,6 +151,7 @@ type quotedKeyMarshalTestStruct struct {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var quotedKeyMarshalTestData = quotedKeyMarshalTestStruct{
 String: "Hello",
@@ -160,6 +161,7 @@ var quotedKeyMarshalTestData = quotedKeyMarshalTestStruct{
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var quotedKeyMarshalTestToml = []byte(`"Yfloat-𝟘" = 3.5
 "Z.string-àéù" = "Hello"
@@ -272,6 +274,7 @@ var docData = testDoc{
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var mapTestDoc = testMapDoc{
 Title: "TOML Marshal Testing",
@@ -559,10 +562,12 @@ func (c customMarshaler) MarshalTOML() ([]byte, error) {
 var customMarshalerData = customMarshaler{FirstName: "Sally", LastName: "Fields"}
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var customMarshalerToml = []byte(`Sally Fields`)
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var nestedCustomMarshalerData = customMarshalerParent{
 Self: customMarshaler{FirstName: "Maiku", LastName: "Suteda"},
@@ -570,6 +575,7 @@ var nestedCustomMarshalerData = customMarshalerParent{
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var nestedCustomMarshalerToml = []byte(`friends = ["Sally Fields"]
 me = "Maiku Suteda"
@@ -611,6 +617,7 @@ func TestUnmarshalTextMarshaler(t *testing.T) {
 }
 // TODO: Remove nolint once type and methods are used by a test
+//
 //nolint:unused
 type precedentMarshaler struct {
 FirstName string
@@ -629,6 +636,7 @@ func (m precedentMarshaler) MarshalTOML() ([]byte, error) {
 }
 // TODO: Remove nolint once type and method are used by a test
+//
 //nolint:unused
 type customPointerMarshaler struct {
 FirstName string
@@ -641,6 +649,7 @@ func (m *customPointerMarshaler) MarshalTOML() ([]byte, error) {
 }
 // TODO: Remove nolint once type and method are used by a test
+//
 //nolint:unused
 type textPointerMarshaler struct {
 FirstName string
@@ -653,6 +662,7 @@ func (m *textPointerMarshaler) MarshalText() ([]byte, error) {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var commentTestToml = []byte(`
 # it's a comment on type
@@ -690,6 +700,7 @@ type mapsTestStruct struct {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var mapsTestData = mapsTestStruct{
 Simple: map[string]string{
@@ -713,6 +724,7 @@ var mapsTestData = mapsTestStruct{
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var mapsTestToml = []byte(`
 [Other]
@@ -735,6 +747,7 @@ var mapsTestToml = []byte(`
 `)
 // TODO: Remove nolint once type is used by a test
+//
 //nolint:deadcode,unused
 type structArrayNoTag struct {
 A struct {
@@ -744,6 +757,7 @@ type structArrayNoTag struct {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var customTagTestToml = []byte(`
 [postgres]
@@ -758,6 +772,7 @@ var customTagTestToml = []byte(`
 `)
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var customCommentTagTestToml = []byte(`
 # db connection
@@ -771,6 +786,7 @@ var customCommentTagTestToml = []byte(`
 `)
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var customCommentedTagTestToml = []byte(`
 [postgres]
@@ -825,6 +841,7 @@ func TestUnmarshalTabInStringAndQuotedKey(t *testing.T) {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var customMultilineTagTestToml = []byte(`int_slice = [
 1,
@@ -834,6 +851,7 @@ var customMultilineTagTestToml = []byte(`int_slice = [
 `)
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var testDocBasicToml = []byte(`
 [document]
@@ -846,12 +864,14 @@ var testDocBasicToml = []byte(`
 `)
 // TODO: Remove nolint once type is used by a test
+//
 //nolint:deadcode
 type testDocCustomTag struct {
 Doc testDocBasicsCustomTag `file:"document"`
 }
 // TODO: Remove nolint once type is used by a test
+//
 //nolint:deadcode
 type testDocBasicsCustomTag struct {
 Bool bool `file:"bool_val"`
@@ -864,6 +884,7 @@ type testDocBasicsCustomTag struct {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,varcheck
 var testDocCustomTagData = testDocCustomTag{
 Doc: testDocBasicsCustomTag{
@@ -966,6 +987,7 @@ func TestUnmarshalInvalidPointerKind(t *testing.T) {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused
 type testDuration struct {
 Nanosec time.Duration `toml:"nanosec"`
@@ -980,6 +1002,7 @@ type testDuration struct {
 }
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var testDurationToml = []byte(`
 nanosec = "1ns"
@@ -994,6 +1017,7 @@ a_string = "15s"
 `)
 // TODO: Remove nolint once var is used by a test
+//
 //nolint:deadcode,unused,varcheck
 var testDurationToml2 = []byte(`a_string = "15s"
 hour = "1h0m0s"
@@ -1007,6 +1031,7 @@ sec = "1s"
 `)
 // TODO: Remove nolint once type is used by a test
+//
 //nolint:deadcode,unused
 type testBadDuration struct {
 Val time.Duration `toml:"val"`
@@ -1060,10 +1085,6 @@ func TestUnmarshalCheckConversionFloatInt(t *testing.T) {
 desc: "int",
 input: `I = 1e300`,
 },
-{
-desc: "float",
-input: `F = 9223372036854775806`,
-},
 }
 for _, test := range testCases {
+5 -7
@@ -1,8 +1,6 @@
 package tracker
-import (
-"github.com/pelletier/go-toml/v2/internal/ast"
-)
+import "github.com/pelletier/go-toml/v2/unstable"
 // KeyTracker is a tracker that keeps track of the current Key as the AST is
 // walked.
@@ -11,19 +9,19 @@ type KeyTracker struct {
 }
 // UpdateTable sets the state of the tracker with the AST table node.
-func (t *KeyTracker) UpdateTable(node *ast.Node) {
+func (t *KeyTracker) UpdateTable(node *unstable.Node) {
 t.reset()
 t.Push(node)
 }
 // UpdateArrayTable sets the state of the tracker with the AST array table node.
-func (t *KeyTracker) UpdateArrayTable(node *ast.Node) {
+func (t *KeyTracker) UpdateArrayTable(node *unstable.Node) {
 t.reset()
 t.Push(node)
 }
 // Push the given key on the stack.
-func (t *KeyTracker) Push(node *ast.Node) {
+func (t *KeyTracker) Push(node *unstable.Node) {
 it := node.Key()
 for it.Next() {
 t.k = append(t.k, string(it.Node().Data))
@@ -31,7 +29,7 @@ func (t *KeyTracker) Push(node *ast.Node) {
 }
 // Pop key from stack.
-func (t *KeyTracker) Pop(node *ast.Node) {
+func (t *KeyTracker) Pop(node *unstable.Node) {
 it := node.Key()
 for it.Next() {
 t.k = t.k[:len(t.k)-1]
+14 -14
@@ -5,7 +5,7 @@ import (
     "fmt"
     "sync"
-    "github.com/pelletier/go-toml/v2/internal/ast"
+    "github.com/pelletier/go-toml/v2/unstable"
 )
 type keyKind uint8
@@ -150,23 +150,23 @@ func (s *SeenTracker) setExplicitFlag(parentIdx int) {
 // CheckExpression takes a top-level node and checks that it does not contain
 // keys that have been seen in previous calls, and validates that types are
 // consistent.
-func (s *SeenTracker) CheckExpression(node *ast.Node) error {
+func (s *SeenTracker) CheckExpression(node *unstable.Node) error {
     if s.entries == nil {
         s.reset()
     }
     switch node.Kind {
-    case ast.KeyValue:
+    case unstable.KeyValue:
         return s.checkKeyValue(node)
-    case ast.Table:
+    case unstable.Table:
         return s.checkTable(node)
-    case ast.ArrayTable:
+    case unstable.ArrayTable:
         return s.checkArrayTable(node)
     default:
         panic(fmt.Errorf("this should not be a top level node type: %s", node.Kind))
     }
 }
-func (s *SeenTracker) checkTable(node *ast.Node) error {
+func (s *SeenTracker) checkTable(node *unstable.Node) error {
     if s.currentIdx >= 0 {
         s.setExplicitFlag(s.currentIdx)
     }
@@ -219,7 +219,7 @@ func (s *SeenTracker) checkTable(node *ast.Node) error {
     return nil
 }
-func (s *SeenTracker) checkArrayTable(node *ast.Node) error {
+func (s *SeenTracker) checkArrayTable(node *unstable.Node) error {
     if s.currentIdx >= 0 {
         s.setExplicitFlag(s.currentIdx)
     }
@@ -267,7 +267,7 @@ func (s *SeenTracker) checkArrayTable(node *ast.Node) error {
     return nil
 }
-func (s *SeenTracker) checkKeyValue(node *ast.Node) error {
+func (s *SeenTracker) checkKeyValue(node *unstable.Node) error {
     parentIdx := s.currentIdx
     it := node.Key()
@@ -297,26 +297,26 @@ func (s *SeenTracker) checkKeyValue(node *ast.Node) error {
     value := node.Value()
     switch value.Kind {
-    case ast.InlineTable:
+    case unstable.InlineTable:
         return s.checkInlineTable(value)
-    case ast.Array:
+    case unstable.Array:
         return s.checkArray(value)
     }
     return nil
 }
-func (s *SeenTracker) checkArray(node *ast.Node) error {
+func (s *SeenTracker) checkArray(node *unstable.Node) error {
     it := node.Children()
     for it.Next() {
         n := it.Node()
         switch n.Kind {
-        case ast.InlineTable:
+        case unstable.InlineTable:
             err := s.checkInlineTable(n)
             if err != nil {
                 return err
             }
-        case ast.Array:
+        case unstable.Array:
             err := s.checkArray(n)
             if err != nil {
                 return err
@@ -326,7 +326,7 @@ func (s *SeenTracker) checkArray(node *ast.Node) error {
     return nil
 }
-func (s *SeenTracker) checkInlineTable(node *ast.Node) error {
+func (s *SeenTracker) checkInlineTable(node *unstable.Node) error {
     if pool.New == nil {
         pool.New = func() interface{} {
             return &SeenTracker{}
+4 -2
@@ -4,6 +4,8 @@ import (
     "fmt"
     "strings"
     "time"
+
+    "github.com/pelletier/go-toml/v2/unstable"
 )
 // LocalDate represents a calendar day in no specific timezone.
@@ -75,7 +77,7 @@ func (d LocalTime) MarshalText() ([]byte, error) {
 func (d *LocalTime) UnmarshalText(b []byte) error {
     res, left, err := parseLocalTime(b)
     if err == nil && len(left) != 0 {
-        err = newDecodeError(left, "extra characters")
+        err = unstable.NewParserError(left, "extra characters")
     }
     if err != nil {
         return err
@@ -109,7 +111,7 @@ func (d LocalDateTime) MarshalText() ([]byte, error) {
 func (d *LocalDateTime) UnmarshalText(data []byte) error {
     res, left, err := parseLocalDateTime(data)
     if err == nil && len(left) != 0 {
-        err = newDecodeError(left, "extra characters")
+        err = unstable.NewParserError(left, "extra characters")
     }
     if err != nil {
         return err
+110 -18
@@ -12,6 +12,8 @@ import (
     "strings"
     "time"
     "unicode"
+
+    "github.com/pelletier/go-toml/v2/internal/characters"
 )
 // Marshal serializes a Go value as a TOML document.
@@ -54,7 +56,7 @@ func NewEncoder(w io.Writer) *Encoder {
 // This behavior can be controlled on an individual struct field basis with the
 // inline tag:
 //
-//	MyField `inline:"true"`
+//	MyField `toml:",inline"`
 func (enc *Encoder) SetTablesInline(inline bool) *Encoder {
     enc.tablesInline = inline
     return enc
@@ -65,7 +67,7 @@ func (enc *Encoder) SetTablesInline(inline bool) *Encoder {
 //
 // This behavior can be controlled on an individual struct field basis with the multiline tag:
 //
 //	MyField `multiline:"true"`
 func (enc *Encoder) SetArraysMultiline(multiline bool) *Encoder {
     enc.arraysMultiline = multiline
     return enc
@@ -89,7 +91,7 @@ func (enc *Encoder) SetIndentTables(indent bool) *Encoder {
 //
 // If v cannot be represented to TOML it returns an error.
 //
-// Encoding rules
+// # Encoding rules
 //
 // A top level slice containing only maps or structs is encoded as [[table
 // array]].
@@ -107,10 +109,30 @@ func (enc *Encoder) SetIndentTables(indent bool) *Encoder {
 // a newline character or a single quote. In that case they are emitted as
 // quoted strings.
 //
+// Unsigned integers larger than math.MaxInt64 cannot be encoded. Doing so
+// results in an error. This rule exists because the TOML specification only
+// requires parsers to support at least the 64 bits integer range. Allowing
+// larger numbers would create non-standard TOML documents, which may not be
+// readable (at best) by other implementations. To encode such numbers, a
+// solution is a custom type that implements encoding.TextMarshaler.
+//
 // When encoding structs, fields are encoded in order of definition, with their
 // exact name.
 //
-// Struct tags
+// Tables and array tables are separated by empty lines. However, consecutive
+// subtables definitions are not. For example:
+//
+//	[top1]
+//
+//	[top2]
+//	[top2.child1]
+//
+//	[[array]]
+//
+//	[[array]]
+//	[array.child2]
+//
+// # Struct tags
 //
 // The encoding of each public struct field can be customized by the format
 // string in the "toml" key of the struct field's tag. This follows
@@ -128,7 +150,8 @@ func (enc *Encoder) SetIndentTables(indent bool) *Encoder {
 //
 // In addition to the "toml" tag struct tag, a "comment" tag can be used to emit
 // a TOML comment before the value being annotated. Comments are ignored inside
-// inline tables.
+// inline tables. For array tables, the comment is only present before the first
+// element of the array.
 func (enc *Encoder) Encode(v interface{}) error {
     var (
         b []byte
@@ -302,7 +325,11 @@ func (enc *Encoder) encode(b []byte, ctx encoderCtx, v reflect.Value) ([]byte, e
         b = append(b, "false"...)
     }
 case reflect.Uint64, reflect.Uint32, reflect.Uint16, reflect.Uint8, reflect.Uint:
-    b = strconv.AppendUint(b, v.Uint(), 10)
+    x := v.Uint()
+    if x > uint64(math.MaxInt64) {
+        return nil, fmt.Errorf("toml: not encoding uint (%d) greater than max int64 (%d)", x, int64(math.MaxInt64))
+    }
+    b = strconv.AppendUint(b, x, 10)
 case reflect.Int64, reflect.Int32, reflect.Int16, reflect.Int8, reflect.Int:
     b = strconv.AppendInt(b, v.Int(), 10)
 default:
@@ -321,18 +348,18 @@ func isNil(v reflect.Value) bool {
     }
 }
+func shouldOmitEmpty(options valueOptions, v reflect.Value) bool {
+    return options.omitempty && isEmptyValue(v)
+}
+
 func (enc *Encoder) encodeKv(b []byte, ctx encoderCtx, options valueOptions, v reflect.Value) ([]byte, error) {
     var err error
-    if (ctx.options.omitempty || options.omitempty) && isEmptyValue(v) {
-        return b, nil
-    }
     if !ctx.inline {
         b = enc.encodeComment(ctx.indent, options.comment, b)
+        b = enc.indent(ctx.indent, b)
     }
-    b = enc.indent(ctx.indent, b)
     b = enc.encodeKey(b, ctx.key)
     b = append(b, " = "...)
@@ -353,6 +380,8 @@ func (enc *Encoder) encodeKv(b []byte, ctx encoderCtx, options valueOptions, v r
 func isEmptyValue(v reflect.Value) bool {
     switch v.Kind() {
+    case reflect.Struct:
+        return isEmptyStruct(v)
     case reflect.Array, reflect.Map, reflect.Slice, reflect.String:
         return v.Len() == 0
     case reflect.Bool:
@@ -369,6 +398,34 @@ func isEmptyValue(v reflect.Value) bool {
     return false
 }
+func isEmptyStruct(v reflect.Value) bool {
+    // TODO: merge with walkStruct and cache.
+    typ := v.Type()
+    for i := 0; i < typ.NumField(); i++ {
+        fieldType := typ.Field(i)
+
+        // only consider exported fields
+        if fieldType.PkgPath != "" {
+            continue
+        }
+
+        tag := fieldType.Tag.Get("toml")
+
+        // special field name to skip field
+        if tag == "-" {
+            continue
+        }
+
+        f := v.Field(i)
+
+        if !isEmptyValue(f) {
+            return false
+        }
+    }
+
+    return true
+}
 const literalQuote = '\''
 func (enc *Encoder) encodeString(b []byte, v string, options valueOptions) []byte {
@@ -382,7 +439,7 @@ func (enc *Encoder) encodeString(b []byte, v string, options valueOptions) []byt
 func needsQuoting(v string) bool {
     // TODO: vectorize
     for _, b := range []byte(v) {
-        if b == '\'' || b == '\r' || b == '\n' || invalidAscii(b) {
+        if b == '\'' || b == '\r' || b == '\n' || characters.InvalidAscii(b) {
             return true
         }
     }
@@ -398,7 +455,6 @@ func (enc *Encoder) encodeLiteralString(b []byte, v string) []byte {
     return b
 }
-//nolint:cyclop
 func (enc *Encoder) encodeQuotedString(multiline bool, b []byte, v string) []byte {
     stringQuote := `"`
@@ -652,10 +708,19 @@ func (enc *Encoder) encodeStruct(b []byte, ctx encoderCtx, v reflect.Value) ([]b
 }
 func (enc *Encoder) encodeComment(indent int, comment string, b []byte) []byte {
-    if comment != "" {
+    for len(comment) > 0 {
+        var line string
+        idx := strings.IndexByte(comment, '\n')
+        if idx >= 0 {
+            line = comment[:idx]
+            comment = comment[idx+1:]
+        } else {
+            line = comment
+            comment = ""
+        }
         b = enc.indent(indent, b)
         b = append(b, "# "...)
-        b = append(b, comment...)
+        b = append(b, line...)
         b = append(b, '\n')
     }
     return b
@@ -736,7 +801,13 @@ func (enc *Encoder) encodeTable(b []byte, ctx encoderCtx, t table) ([]byte, erro
     }
     ctx.skipTableHeader = false
+    hasNonEmptyKV := false
     for _, kv := range t.kvs {
+        if shouldOmitEmpty(kv.Options, kv.Value) {
+            continue
+        }
+        hasNonEmptyKV = true
+
         ctx.setKey(kv.Key)
         b, err = enc.encodeKv(b, ctx, kv.Options, kv.Value)
@@ -747,7 +818,20 @@ func (enc *Encoder) encodeTable(b []byte, ctx encoderCtx, t table) ([]byte, erro
         b = append(b, '\n')
     }
+    first := true
     for _, table := range t.tables {
+        if shouldOmitEmpty(table.Options, table.Value) {
+            continue
+        }
+        if first {
+            first = false
+            if hasNonEmptyKV {
+                b = append(b, '\n')
+            }
+        } else {
+            b = append(b, "\n"...)
+        }
+
         ctx.setKey(table.Key)
         ctx.options = table.Options
@@ -756,8 +840,6 @@ func (enc *Encoder) encodeTable(b []byte, ctx encoderCtx, t table) ([]byte, erro
         if err != nil {
             return nil, err
         }
-
-        b = append(b, '\n')
     }
     return b, nil
@@ -770,6 +852,10 @@ func (enc *Encoder) encodeTableInline(b []byte, ctx encoderCtx, t table) ([]byte
     first := true
     for _, kv := range t.kvs {
+        if shouldOmitEmpty(kv.Options, kv.Value) {
+            continue
+        }
+
         if first {
             first = false
         } else {
@@ -785,7 +871,7 @@ func (enc *Encoder) encodeTableInline(b []byte, ctx encoderCtx, t table) ([]byte
     }
     if len(t.tables) > 0 {
-        panic("inline table cannot contain nested tables, online key-values")
+        panic("inline table cannot contain nested tables, only key-values")
     }
     b = append(b, "}"...)
@@ -881,7 +967,13 @@ func (enc *Encoder) encodeSliceAsArrayTable(b []byte, ctx encoderCtx, v reflect.
     scratch = append(scratch, "]]\n"...)
     ctx.skipTableHeader = true
+    b = enc.encodeComment(ctx.indent, ctx.options.comment, b)
+
     for i := 0; i < v.Len(); i++ {
+        if i != 0 {
+            b = append(b, "\n"...)
+        }
+
         b = append(b, scratch...)
         var err error
+218 -93
@@ -39,21 +39,21 @@ func TestMarshal(t *testing.T) {
             v: map[string]string{
                 "hello": "world",
             },
-            expected: "hello = 'world'",
+            expected: "hello = 'world'\n",
         },
         {
             desc: "map with new line in key",
             v: map[string]string{
                 "hel\nlo": "world",
             },
-            expected: `"hel\nlo" = 'world'`,
+            expected: "\"hel\\nlo\" = 'world'\n",
         },
         {
             desc: `map with " in key`,
             v: map[string]string{
                 `hel"lo`: "world",
             },
-            expected: `'hel"lo' = 'world'`,
+            expected: "'hel\"lo' = 'world'\n",
         },
         {
             desc: "map in map and string",
@@ -62,9 +62,9 @@ func TestMarshal(t *testing.T) {
                 "hello": "world",
                 },
             },
-            expected: `
-[table]
-hello = 'world'`,
+            expected: `[table]
+hello = 'world'
+`,
         },
         {
             desc: "map in map in map and string",
@@ -75,10 +75,10 @@ hello = 'world'`,
                 },
                 },
             },
-            expected: `
-[this]
-[this.is]
-a = 'test'`,
+            expected: `[this]
+[this.is]
+a = 'test'
+`,
         },
         {
             desc: "map in map in map and string with values",
@@ -90,18 +90,20 @@ a = 'test'`,
                 "also": "that",
                 },
             },
-            expected: `
-[this]
-also = 'that'
-[this.is]
-a = 'test'`,
+            expected: `[this]
+also = 'that'
+
+[this.is]
+a = 'test'
+`,
         },
         {
             desc: "simple string array",
             v: map[string][]string{
                 "array": {"one", "two", "three"},
             },
-            expected: `array = ['one', 'two', 'three']`,
+            expected: `array = ['one', 'two', 'three']
+`,
         },
         {
             desc: "empty string array",
@@ -118,14 +120,16 @@ a = 'test'`,
             v: map[string][][]string{
                 "array": {{"one", "two"}, {"three"}},
             },
-            expected: `array = [['one', 'two'], ['three']]`,
+            expected: `array = [['one', 'two'], ['three']]
+`,
         },
         {
             desc: "mixed strings and nested string arrays",
             v: map[string][]interface{}{
                 "array": {"a string", []string{"one", "two"}, "last"},
             },
-            expected: `array = ['a string', ['one', 'two'], 'last']`,
+            expected: `array = ['a string', ['one', 'two'], 'last']
+`,
         },
         {
             desc: "array of maps",
@@ -135,9 +139,9 @@ a = 'test'`,
                 {"map2.1": "v2.1"},
                 },
             },
-            expected: `
-[[top]]
+            expected: `[[top]]
 'map1.1' = 'v1.1'
+
 [[top]]
 'map2.1' = 'v2.1'
 `,
@@ -148,9 +152,9 @@ a = 'test'`,
                 "key1": "value1",
                 "key2": "value2",
             },
-            expected: `
-key1 = 'value1'
-key2 = 'value2'`,
+            expected: `key1 = 'value1'
+key2 = 'value2'
+`,
         },
         {
             desc: "simple struct",
@@ -159,7 +163,8 @@ key2 = 'value2'`,
             }{
                 A: "foo",
             },
-            expected: `A = 'foo'`,
+            expected: `A = 'foo'
+`,
         },
         {
             desc: "one level of structs within structs",
@@ -174,8 +179,7 @@ key2 = 'value2'`,
                 K2: "v2",
                 },
             },
-            expected: `
-[A]
+            expected: `[A]
 K1 = 'v1'
 K2 = 'v2'
 `,
@@ -190,10 +194,10 @@ K2 = 'v2'
                 },
                 },
             },
-            expected: `
-[root]
+            expected: `[root]
 [[root.nested]]
 name = 'Bob'
+
 [[root.nested]]
 name = 'Alice'
 `,
@@ -203,49 +207,53 @@ name = 'Alice'
             v: map[string]interface{}{
                 "a": "'\b\f\r\t\"\\",
             },
-            expected: `a = "'\b\f\r\t\"\\"`,
+            expected: `a = "'\b\f\r\t\"\\"
+`,
         },
         {
             desc: "string utf8 low",
             v: map[string]interface{}{
                 "a": "'Ę",
             },
-            expected: `a = "'Ę"`,
+            expected: `a = "'Ę"
+`,
         },
         {
             desc: "string utf8 low 2",
             v: map[string]interface{}{
                 "a": "'\u10A85",
             },
-            expected: "a = \"'\u10A85\"",
+            expected: "a = \"'\u10A85\"\n",
         },
         {
             desc: "string utf8 low 2",
             v: map[string]interface{}{
                 "a": "'\u10A85",
             },
-            expected: "a = \"'\u10A85\"",
+            expected: "a = \"'\u10A85\"\n",
         },
         {
             desc: "emoji",
             v: map[string]interface{}{
                 "a": "'😀",
             },
-            expected: "a = \"'😀\"",
+            expected: "a = \"'😀\"\n",
         },
         {
             desc: "control char",
             v: map[string]interface{}{
                 "a": "'\u001A",
             },
-            expected: `a = "'\u001A"`,
+            expected: `a = "'\u001A"
+`,
         },
         {
             desc: "multi-line string",
             v: map[string]interface{}{
                 "a": "hello\nworld",
             },
-            expected: `a = "hello\nworld"`,
+            expected: `a = "hello\nworld"
+`,
         },
         {
             desc: "multi-line forced",
@@ -256,7 +264,8 @@ name = 'Alice'
             },
             expected: `A = """
 hello
-world"""`,
+world"""
+`,
         },
         {
             desc: "inline field",
@@ -271,8 +280,8 @@ world"""`,
                 "isinline": "no",
                 },
             },
-            expected: `
-A = {isinline = 'yes'}
+            expected: `A = {isinline = 'yes'}
+
 [B]
 isinline = 'no'
 `,
@@ -286,8 +295,7 @@ isinline = 'no'
             A: []int{1, 2, 3, 4},
             B: []int{1, 2, 3, 4},
         },
-            expected: `
-A = [
+            expected: `A = [
 1,
 2,
 3,
@@ -303,8 +311,7 @@ B = [1, 2, 3, 4]
             }{
                 A: [][]int{{1, 2}, {3, 4}},
             },
-            expected: `
-A = [
+            expected: `A = [
 [1, 2],
 [3, 4]
 ]
@@ -329,7 +336,8 @@ A = [
             }{
                 A: []*int{nil},
             },
-            expected: `A = [0]`,
+            expected: `A = [0]
+`,
         },
         {
             desc: "nil pointer in slice uses zero value",
@@ -338,7 +346,8 @@ A = [
             }{
                 A: []*int{nil},
             },
-            expected: `A = [0]`,
+            expected: `A = [0]
+`,
         },
         {
             desc: "pointer in slice",
@@ -347,7 +356,8 @@ A = [
             }{
                 A: []*int{&someInt},
             },
-            expected: `A = [42]`,
+            expected: `A = [42]
+`,
         },
         {
             desc: "inline table in inline table",
@@ -358,23 +368,25 @@ A = [
                 },
                 },
             },
-            expected: `A = {A = {A = 'hello'}}`,
+            expected: `A = {A = {A = 'hello'}}
+`,
         },
         {
             desc: "empty slice in map",
             v: map[string][]string{
                 "a": {},
             },
-            expected: `a = []`,
+            expected: `a = []
+`,
         },
         {
             desc: "map in slice",
             v: map[string][]map[string]string{
                 "a": {{"hello": "world"}},
             },
-            expected: `
-[[a]]
-hello = 'world'`,
+            expected: `[[a]]
+hello = 'world'
+`,
         },
         {
             desc: "newline in map in slice",
@@ -382,7 +394,8 @@ hello = 'world'`,
                 "a\n": {{"hello": "world"}},
             },
             expected: `[["a\n"]]
-hello = 'world'`,
+hello = 'world'
+`,
         },
         {
             desc: "newline in map in slice",
@@ -398,7 +411,8 @@ hello = 'world'`,
             }{
                 A: []struct{}{},
             },
-            expected: `A = []`,
+            expected: `A = []
+`,
         },
         {
             desc: "nil field is ignored",
@@ -418,7 +432,8 @@ hello = 'world'`,
                 Public: "shown",
                 private: "hidden",
             },
-            expected: `Public = 'shown'`,
+            expected: `Public = 'shown'
+`,
         },
         {
             desc: "fields tagged - are ignored",
@@ -442,7 +457,8 @@ hello = 'world'`,
             v: map[string]interface{}{
                 "hello\nworld": 42,
             },
-            expected: `"hello\nworld" = 42`,
+            expected: `"hello\nworld" = 42
+`,
         },
         {
             desc: "new line in parent of nested table key",
@@ -452,7 +468,8 @@ hello = 'world'`,
                 },
             },
             expected: `["hello\nworld"]
-inner = 42`,
+inner = 42
+`,
         },
         {
             desc: "new line in nested table key",
@@ -465,7 +482,8 @@ inner = 42`,
             },
             expected: `[parent]
 [parent."in\ner"]
-foo = 42`,
+foo = 42
+`,
         },
         {
             desc: "invalid map key",
@@ -488,7 +506,8 @@ foo = 42`,
             }{
                 T: time.Time{},
             },
-            expected: `T = 0001-01-01T00:00:00Z`,
+            expected: `T = 0001-01-01T00:00:00Z
+`,
         },
         {
             desc: "time nano",
@@ -497,7 +516,8 @@ foo = 42`,
             }{
                 T: time.Date(1979, time.May, 27, 0, 32, 0, 999999000, time.UTC),
             },
-            expected: `T = 1979-05-27T00:32:00.999999Z`,
+            expected: `T = 1979-05-27T00:32:00.999999Z
+`,
         },
         {
             desc: "bool",
@@ -508,9 +528,9 @@ foo = 42`,
                 A: false,
                 B: true,
             },
-            expected: `
-A = false
-B = true`,
+            expected: `A = false
+B = true
+`,
         },
         {
             desc: "numbers",
@@ -541,8 +561,7 @@ B = true`,
                 K: 42,
                 L: 2.2,
             },
-            expected: `
-A = 1.1
+            expected: `A = 1.1
 B = 42
 C = 42
 D = 42
@@ -553,7 +572,8 @@ H = 42
 I = 42
 J = 42
 K = 42
-L = 2.2`,
+L = 2.2
+`,
         },
         {
             desc: "comments",
@@ -566,8 +586,7 @@ L = 2.2`,
                 Three: []int{1, 2, 3},
                 },
             },
-            expected: `
-# Before table
+            expected: `# Before table
 [Table]
 One = 1
 # Before kv
@@ -589,7 +608,7 @@ Three = [1, 2, 3]
             }
             require.NoError(t, err)
-            equalStringsIgnoreNewlines(t, e.expected, string(b))
+            assert.Equal(t, e.expected, string(b))
             // make sure the output is always valid TOML
             defaultMap := map[string]interface{}{}
@@ -664,12 +683,6 @@ func testWithFlags(t *testing.T, flags int, setters flagsSetters, testfn func(t
     }
 }
-func equalStringsIgnoreNewlines(t *testing.T, expected string, actual string) {
-    t.Helper()
-    cutset := "\n"
-    assert.Equal(t, strings.Trim(expected, cutset), strings.Trim(actual, cutset))
-}
-
 func TestMarshalFloats(t *testing.T) {
     v := map[string]float32{
         "nan": float32(math.NaN()),
@@ -709,7 +722,8 @@ func TestMarshalIndentTables(t *testing.T) {
             v: map[string]interface{}{
                 "foo": "bar",
             },
-            expected: `foo = 'bar'`,
+            expected: `foo = 'bar'
+`,
         },
         {
             desc: "one level table",
@@ -719,8 +733,7 @@ func TestMarshalIndentTables(t *testing.T) {
                 "two": "value2",
                 },
             },
-            expected: `
-[foo]
+            expected: `[foo]
 one = 'value1'
 two = 'value2'
 `,
@@ -736,10 +749,11 @@ func TestMarshalIndentTables(t *testing.T) {
                 },
                 },
             },
-            expected: `
-root = 'value0'
+            expected: `root = 'value0'
+
 [level1]
 one = 'value1'
+
 [level1.level2]
 two = 'value2'
 `,
@@ -754,7 +768,7 @@ root = 'value0'
             enc.SetIndentTables(true)
             err := enc.Encode(e.v)
             require.NoError(t, err)
-            equalStringsIgnoreNewlines(t, e.expected, buf.String())
+            assert.Equal(t, e.expected, buf.String())
         })
     }
 }
@@ -799,7 +813,7 @@ func TestMarshalTextMarshaler(t *testing.T) {
     m := map[string]interface{}{"a": &customTextMarshaler{value: 2}}
     r, err := toml.Marshal(m)
     require.NoError(t, err)
-    equalStringsIgnoreNewlines(t, "a = '::2'", string(r))
+    assert.Equal(t, "a = '::2'\n", string(r))
 }
 type brokenWriter struct{}
@@ -822,10 +836,10 @@ func TestEncoderSetIndentSymbol(t *testing.T) {
     enc.SetIndentSymbol(">>>")
     err := enc.Encode(map[string]map[string]string{"parent": {"hello": "world"}})
     require.NoError(t, err)
-    expected := `
-[parent]
->>>hello = 'world'`
-    equalStringsIgnoreNewlines(t, expected, w.String())
+    expected := `[parent]
+>>>hello = 'world'
+`
+    assert.Equal(t, expected, w.String())
 }
 func TestEncoderOmitempty(t *testing.T) {
@@ -856,9 +870,9 @@ func TestEncoderOmitempty(t *testing.T) {
     b, err := toml.Marshal(d)
     require.NoError(t, err)
-    expected := `[Struct]`
-    equalStringsIgnoreNewlines(t, expected, string(b))
+    expected := ``
+    assert.Equal(t, expected, string(b))
 }
 func TestEncoderTagFieldName(t *testing.T) {
@@ -873,13 +887,12 @@ func TestEncoderTagFieldName(t *testing.T) {
     b, err := toml.Marshal(d)
     require.NoError(t, err)
-    expected := `
-hello = 'world'
+    expected := `hello = 'world'
 '#' = ''
 Bad = ''
 `
-    equalStringsIgnoreNewlines(t, expected, string(b))
+    assert.Equal(t, expected, string(b))
 }
 func TestIssue436(t *testing.T) {
@@ -893,12 +906,11 @@ func TestIssue436(t *testing.T) {
     err = toml.NewEncoder(&buf).Encode(v)
     require.NoError(t, err)
-    expected := `
-[[a]]
+    expected := `[[a]]
 [a.b]
 c = 'd'
 `
-    equalStringsIgnoreNewlines(t, expected, buf.String())
+    assert.Equal(t, expected, buf.String())
 }
 func TestIssue424(t *testing.T) {
@@ -980,7 +992,7 @@ func TestIssue678(t *testing.T) {
     out, err := toml.Marshal(cfg)
     require.NoError(t, err)
-    equalStringsIgnoreNewlines(t, "BigInt = '123'", string(out))
+    assert.Equal(t, "BigInt = '123'\n", string(out))
     cfg2 := &Config{}
     err = toml.Unmarshal(out, cfg2)
@@ -1004,6 +1016,85 @@ func TestIssue752(t *testing.T) {
     require.Equal(t, "", string(out))
 }
+func TestIssue768(t *testing.T) {
+    type cfg struct {
+        Name string `comment:"This is a multiline comment.\nThis is line 2."`
+    }
+
+    out, err := toml.Marshal(&cfg{})
+    require.NoError(t, err)
+
+    expected := `# This is a multiline comment.
+# This is line 2.
+Name = ''
+`
+
+    require.Equal(t, expected, string(out))
+}
+
+func TestIssue786(t *testing.T) {
+    type Dependencies struct {
+        Dependencies         []string `toml:"dependencies,multiline,omitempty"`
+        BuildDependencies    []string `toml:"buildDependencies,multiline,omitempty"`
+        OptionalDependencies []string `toml:"optionalDependencies,multiline,omitempty"`
+    }
+
+    type Test struct {
+        Dependencies Dependencies `toml:"dependencies,omitempty"`
+    }
+
+    x := Test{}
+    b, err := toml.Marshal(x)
+    require.NoError(t, err)
+    require.Equal(t, "", string(b))
+
+    type General struct {
+        From      string `toml:"from,omitempty" json:"from,omitempty" comment:"from in graphite-web format, the local TZ is used"`
+        Randomize bool   `toml:"randomize" json:"randomize" comment:"randomize starting time with [0,step)"`
+    }
+
+    type Custom struct {
+        Name string `toml:"name" json:"name,omitempty" comment:"names for generator, braces are expanded like in shell"`
+        Type string `toml:"type,omitempty" json:"type,omitempty" comment:"type of generator"`
+        General
+    }
+
+    type Config struct {
+        General
+        Custom []Custom `toml:"custom,omitempty" json:"custom,omitempty" comment:"generators with custom parameters can be specified separately"`
+    }
+
+    buf := new(bytes.Buffer)
+    config := &Config{General: General{From: "-2d", Randomize: true}}
+    config.Custom = []Custom{{Name: "omit", General: General{Randomize: false}}}
+    config.Custom = append(config.Custom, Custom{Name: "present", General: General{From: "-2d", Randomize: true}})
+    encoder := toml.NewEncoder(buf)
+    encoder.Encode(config)
+
+    expected := `# from in graphite-web format, the local TZ is used
+from = '-2d'
+# randomize starting time with [0,step)
+randomize = true
+
+# generators with custom parameters can be specified separately
+[[custom]]
+# names for generator, braces are expanded like in shell
+name = 'omit'
+# randomize starting time with [0,step)
+randomize = false
+
+[[custom]]
+# names for generator, braces are expanded like in shell
+name = 'present'
+# from in graphite-web format, the local TZ is used
+from = '-2d'
+# randomize starting time with [0,step)
+randomize = true
+`
+    require.Equal(t, expected, buf.String())
+}
func TestMarshalNestedAnonymousStructs(t *testing.T) {
type Embedded struct {
Value string `toml:"value" json:"value"`
@@ -1025,6 +1116,7 @@ func TestMarshalNestedAnonymousStructs(t *testing.T) {
}
expected := `value = ''
[top]
value = ''
@@ -1033,7 +1125,6 @@ value = ''
[anonymous]
value = ''
`
result, err := toml.Marshal(doc)
@@ -1057,9 +1148,9 @@ func TestMarshalNestedAnonymousStructs_DuplicateField(t *testing.T) {
doc.Value = "shadows"
expected := `value = 'shadows'
[top]
value = ''
`
result, err := toml.Marshal(doc)
@@ -1070,7 +1161,7 @@ value = ''
func TestLocalTime(t *testing.T) {
v := map[string]toml.LocalTime{
-"a": toml.LocalTime{
+"a": {
Hour: 1,
Minute: 2,
Second: 3,
@@ -1086,6 +1177,40 @@ func TestLocalTime(t *testing.T) {
require.Equal(t, expected, string(out))
}
func TestMarshalUint64Overflow(t *testing.T) {
// The TOML spec only requires implementation to provide support for the
// int64 range. To avoid generating TOML documents that would not be
// supported by standard-compliant parsers, uint64 > max int64 cannot be
// marshaled.
x := map[string]interface{}{
"foo": uint64(math.MaxInt64) + 1,
}
_, err := toml.Marshal(x)
require.Error(t, err)
}
func TestIndentWithInlineTable(t *testing.T) {
x := map[string][]map[string]string{
"one": []map[string]string{
{"0": "0"},
{"1": "1"},
},
}
expected := `one = [
{0 = '0'},
{1 = '1'}
]
`
var buf bytes.Buffer
enc := toml.NewEncoder(&buf)
enc.SetIndentTables(true)
enc.SetTablesInline(true)
enc.SetArraysMultiline(true)
require.NoError(t, enc.Encode(x))
assert.Equal(t, expected, buf.String())
}
func ExampleMarshal() {
type MyConfig struct {
Version int
+17 -17
@@ -1,9 +1,9 @@
package toml
import (
-"github.com/pelletier/go-toml/v2/internal/ast"
"github.com/pelletier/go-toml/v2/internal/danger"
"github.com/pelletier/go-toml/v2/internal/tracker"
+"github.com/pelletier/go-toml/v2/unstable"
)
type strict struct {
@@ -12,10 +12,10 @@ type strict struct {
// Tracks the current key being processed.
key tracker.KeyTracker
-missing []decodeError
+missing []unstable.ParserError
}
-func (s *strict) EnterTable(node *ast.Node) {
+func (s *strict) EnterTable(node *unstable.Node) {
if !s.Enabled {
return
}
@@ -23,7 +23,7 @@ func (s *strict) EnterTable(node *ast.Node) {
s.key.UpdateTable(node)
}
-func (s *strict) EnterArrayTable(node *ast.Node) {
+func (s *strict) EnterArrayTable(node *unstable.Node) {
if !s.Enabled {
return
}
@@ -31,7 +31,7 @@ func (s *strict) EnterArrayTable(node *ast.Node) {
s.key.UpdateArrayTable(node)
}
-func (s *strict) EnterKeyValue(node *ast.Node) {
+func (s *strict) EnterKeyValue(node *unstable.Node) {
if !s.Enabled {
return
}
@@ -39,7 +39,7 @@ func (s *strict) EnterKeyValue(node *ast.Node) {
s.key.Push(node)
}
-func (s *strict) ExitKeyValue(node *ast.Node) {
+func (s *strict) ExitKeyValue(node *unstable.Node) {
if !s.Enabled {
return
}
@@ -47,27 +47,27 @@ func (s *strict) ExitKeyValue(node *ast.Node) {
s.key.Pop(node)
}
-func (s *strict) MissingTable(node *ast.Node) {
+func (s *strict) MissingTable(node *unstable.Node) {
if !s.Enabled {
return
}
-s.missing = append(s.missing, decodeError{
-highlight: keyLocation(node),
-message: "missing table",
-key: s.key.Key(),
+s.missing = append(s.missing, unstable.ParserError{
+Highlight: keyLocation(node),
+Message: "missing table",
+Key: s.key.Key(),
})
}
-func (s *strict) MissingField(node *ast.Node) {
+func (s *strict) MissingField(node *unstable.Node) {
if !s.Enabled {
return
}
-s.missing = append(s.missing, decodeError{
-highlight: keyLocation(node),
-message: "missing field",
-key: s.key.Key(),
+s.missing = append(s.missing, unstable.ParserError{
+Highlight: keyLocation(node),
+Message: "missing field",
+Key: s.key.Key(),
})
}
@@ -88,7 +88,7 @@ func (s *strict) Error(doc []byte) error {
return err
}
-func keyLocation(node *ast.Node) []byte {
+func keyLocation(node *unstable.Node) []byte {
k := node.Key()
hasOne := k.Next()
+5 -5
@@ -6,9 +6,9 @@ import (
"time"
)
-var timeType = reflect.TypeOf(time.Time{})
-var textMarshalerType = reflect.TypeOf(new(encoding.TextMarshaler)).Elem()
-var textUnmarshalerType = reflect.TypeOf(new(encoding.TextUnmarshaler)).Elem()
-var mapStringInterfaceType = reflect.TypeOf(map[string]interface{}{})
-var sliceInterfaceType = reflect.TypeOf([]interface{}{})
+var timeType = reflect.TypeOf((*time.Time)(nil)).Elem()
+var textMarshalerType = reflect.TypeOf((*encoding.TextMarshaler)(nil)).Elem()
+var textUnmarshalerType = reflect.TypeOf((*encoding.TextUnmarshaler)(nil)).Elem()
+var mapStringInterfaceType = reflect.TypeOf(map[string]interface{}(nil))
+var sliceInterfaceType = reflect.TypeOf([]interface{}(nil))
var stringType = reflect.TypeOf("")
+134 -97
@@ -12,16 +12,16 @@ import (
"sync/atomic"
"time"
-"github.com/pelletier/go-toml/v2/internal/ast"
"github.com/pelletier/go-toml/v2/internal/danger"
"github.com/pelletier/go-toml/v2/internal/tracker"
+"github.com/pelletier/go-toml/v2/unstable"
)
// Unmarshal deserializes a TOML document into a Go value.
//
// It is a shortcut for Decoder.Decode() with the default options.
func Unmarshal(data []byte, v interface{}) error {
-p := parser{}
+p := unstable.Parser{}
p.Reset(data)
d := decoder{p: &p}
@@ -60,7 +60,7 @@ func (d *Decoder) DisallowUnknownFields() *Decoder {
// are ignored. See Decoder.DisallowUnknownFields() to change this behavior.
//
// When a TOML local date, time, or date-time is decoded into a time.Time, its
-// value is represented in time.Local timezone. Otherwise the approriate Local*
+// value is represented in time.Local timezone. Otherwise the appropriate Local*
// structure is used. For time values, precision up to the nanosecond is
// supported by truncating extra digits.
//
@@ -79,29 +79,29 @@ func (d *Decoder) DisallowUnknownFields() *Decoder {
// strict mode and a field is missing, a `toml.StrictMissingError` is
// returned. In any other case, this function returns a standard Go error.
//
-// Type mapping
+// # Type mapping
//
// List of supported TOML types and their associated accepted Go types:
//
// String -> string
// Integer -> uint*, int*, depending on size
// Float -> float*, depending on size
// Boolean -> bool
// Offset Date-Time -> time.Time
// Local Date-time -> LocalDateTime, time.Time
// Local Date -> LocalDate, time.Time
// Local Time -> LocalTime, time.Time
// Array -> slice and array, depending on elements types
// Table -> map and struct
// Inline Table -> same as Table
// Array of Tables -> same as Array and Table
func (d *Decoder) Decode(v interface{}) error {
b, err := ioutil.ReadAll(d.r)
if err != nil {
return fmt.Errorf("toml: %w", err)
}
-p := parser{}
+p := unstable.Parser{}
p.Reset(b)
dec := decoder{
p: &p,
@@ -115,7 +115,7 @@ func (d *Decoder) Decode(v interface{}) error {
type decoder struct {
// Which parser instance in use for this decoding session.
-p *parser
+p *unstable.Parser
// Flag indicating that the current expression is stashed.
// If set to true, calling nextExpr will not actually pull a new expression
@@ -123,7 +123,7 @@ type decoder struct {
stashedExpr bool
// Skip expressions until a table is found. This is set to true when a
-// table could not be create (missing field in map), so all KV expressions
+// table could not be created (missing field in map), so all KV expressions
// need to be skipped.
skipUntilTable bool
@@ -157,7 +157,7 @@ func (d *decoder) typeMismatchError(toml string, target reflect.Type) error {
return fmt.Errorf("toml: cannot decode TOML %s into a Go value of type %s", toml, target)
}
-func (d *decoder) expr() *ast.Node {
+func (d *decoder) expr() *unstable.Node {
return d.p.Expression()
}
@@ -208,12 +208,12 @@ func (d *decoder) FromParser(v interface{}) error {
err := d.fromParser(r)
if err == nil {
-return d.strict.Error(d.p.data)
+return d.strict.Error(d.p.Data())
}
-var e *decodeError
+var e *unstable.ParserError
if errors.As(err, &e) {
-return wrapDecodeError(d.p.data, e)
+return wrapDecodeError(d.p.Data(), e)
}
return err
@@ -234,16 +234,16 @@ func (d *decoder) fromParser(root reflect.Value) error {
Rules for the unmarshal code:
- The stack is used to keep track of which values need to be set where.
-- handle* functions <=> switch on a given ast.Kind.
+- handle* functions <=> switch on a given unstable.Kind.
- unmarshalX* functions need to unmarshal a node of kind X.
- An "object" is either a struct or a map.
*/
-func (d *decoder) handleRootExpression(expr *ast.Node, v reflect.Value) error {
+func (d *decoder) handleRootExpression(expr *unstable.Node, v reflect.Value) error {
var x reflect.Value
var err error
-if !(d.skipUntilTable && expr.Kind == ast.KeyValue) {
+if !(d.skipUntilTable && expr.Kind == unstable.KeyValue) {
err = d.seen.CheckExpression(expr)
if err != nil {
return err
@@ -251,16 +251,16 @@ func (d *decoder) handleRootExpression(expr *ast.Node, v reflect.Value) error {
}
switch expr.Kind {
-case ast.KeyValue:
+case unstable.KeyValue:
if d.skipUntilTable {
return nil
}
x, err = d.handleKeyValue(expr, v)
-case ast.Table:
+case unstable.Table:
d.skipUntilTable = false
d.strict.EnterTable(expr)
x, err = d.handleTable(expr.Key(), v)
-case ast.ArrayTable:
+case unstable.ArrayTable:
d.skipUntilTable = false
d.strict.EnterArrayTable(expr)
x, err = d.handleArrayTable(expr.Key(), v)
@@ -269,7 +269,7 @@ func (d *decoder) handleRootExpression(expr *ast.Node, v reflect.Value) error {
}
if d.skipUntilTable {
-if expr.Kind == ast.Table || expr.Kind == ast.ArrayTable {
+if expr.Kind == unstable.Table || expr.Kind == unstable.ArrayTable {
d.strict.MissingTable(expr)
}
} else if err == nil && x.IsValid() {
@@ -279,14 +279,14 @@ func (d *decoder) handleRootExpression(expr *ast.Node, v reflect.Value) error {
return err
}
-func (d *decoder) handleArrayTable(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleArrayTable(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
if key.Next() {
return d.handleArrayTablePart(key, v)
}
return d.handleKeyValues(v)
}
-func (d *decoder) handleArrayTableCollectionLast(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleArrayTableCollectionLast(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
switch v.Kind() {
case reflect.Interface:
elem := v.Elem()
@@ -339,21 +339,21 @@ func (d *decoder) handleArrayTableCollectionLast(key ast.Iterator, v reflect.Val
case reflect.Array:
idx := d.arrayIndex(true, v)
if idx >= v.Len() {
-return v, fmt.Errorf("toml: cannot decode array table into %s at position %d", v.Type(), idx)
+return v, fmt.Errorf("%s at position %d", d.typeMismatchError("array table", v.Type()), idx)
}
elem := v.Index(idx)
_, err := d.handleArrayTable(key, elem)
return v, err
+default:
+return reflect.Value{}, d.typeMismatchError("array table", v.Type())
}
-return d.handleArrayTable(key, v)
}
// When parsing an array table expression, each part of the key needs to be
// evaluated like a normal key, but if it returns a collection, it also needs to
// point to the last element of the collection. Unless it is the last part of
// the key, then it needs to create a new element at the end.
-func (d *decoder) handleArrayTableCollection(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleArrayTableCollection(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
if key.IsLast() {
return d.handleArrayTableCollectionLast(key, v)
}
@@ -390,7 +390,7 @@ func (d *decoder) handleArrayTableCollection(key ast.Iterator, v reflect.Value)
case reflect.Array:
idx := d.arrayIndex(false, v)
if idx >= v.Len() {
-return v, fmt.Errorf("toml: cannot decode array table into %s at position %d", v.Type(), idx)
+return v, fmt.Errorf("%s at position %d", d.typeMismatchError("array table", v.Type()), idx)
}
elem := v.Index(idx)
_, err := d.handleArrayTable(key, elem)
@@ -400,7 +400,7 @@ func (d *decoder) handleArrayTableCollection(key ast.Iterator, v reflect.Value)
return d.handleArrayTable(key, v)
}
-func (d *decoder) handleKeyPart(key ast.Iterator, v reflect.Value, nextFn handlerFn, makeFn valueMakerFn) (reflect.Value, error) {
+func (d *decoder) handleKeyPart(key unstable.Iterator, v reflect.Value, nextFn handlerFn, makeFn valueMakerFn) (reflect.Value, error) {
var rv reflect.Value
// First, dispatch over v to make sure it is a valid object.
@@ -483,7 +483,7 @@ func (d *decoder) handleKeyPart(key ast.Iterator, v reflect.Value, nextFn handle
d.errorContext.Struct = t
d.errorContext.Field = path
-f := v.FieldByIndex(path)
+f := fieldByIndex(v, path)
x, err := nextFn(key, f)
if err != nil || d.skipUntilTable {
return reflect.Value{}, err
@@ -518,7 +518,7 @@ func (d *decoder) handleKeyPart(key ast.Iterator, v reflect.Value, nextFn handle
// HandleArrayTablePart navigates the Go structure v using the key v. It is
// only used for the prefix (non-last) parts of an array-table. When
// encountering a collection, it should go to the last element.
-func (d *decoder) handleArrayTablePart(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleArrayTablePart(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
var makeFn valueMakerFn
if key.IsLast() {
makeFn = makeSliceInterface
@@ -530,10 +530,10 @@ func (d *decoder) handleArrayTablePart(key ast.Iterator, v reflect.Value) (refle
// HandleTable returns a reference when it has checked the next expression but
// cannot handle it.
-func (d *decoder) handleTable(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleTable(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
if v.Kind() == reflect.Slice {
if v.Len() == 0 {
-return reflect.Value{}, newDecodeError(key.Node().Data, "cannot store a table in a slice")
+return reflect.Value{}, unstable.NewParserError(key.Node().Data, "cannot store a table in a slice")
}
elem := v.Index(v.Len() - 1)
x, err := d.handleTable(key, elem)
@@ -560,7 +560,7 @@ func (d *decoder) handleKeyValues(v reflect.Value) (reflect.Value, error) {
var rv reflect.Value
for d.nextExpr() {
expr := d.expr()
-if expr.Kind != ast.KeyValue {
+if expr.Kind != unstable.KeyValue {
// Stash the expression so that fromParser can just loop and use
// the right handler.
// We could just recurse ourselves here, but at least this gives a
@@ -587,7 +587,7 @@ func (d *decoder) handleKeyValues(v reflect.Value) (reflect.Value, error) {
}
type (
-handlerFn func(key ast.Iterator, v reflect.Value) (reflect.Value, error)
+handlerFn func(key unstable.Iterator, v reflect.Value) (reflect.Value, error)
valueMakerFn func() reflect.Value
)
@@ -599,11 +599,11 @@ func makeSliceInterface() reflect.Value {
return reflect.MakeSlice(sliceInterfaceType, 0, 16)
}
-func (d *decoder) handleTablePart(key ast.Iterator, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleTablePart(key unstable.Iterator, v reflect.Value) (reflect.Value, error) {
return d.handleKeyPart(key, v, d.handleTable, makeMapStringInterface)
}
-func (d *decoder) tryTextUnmarshaler(node *ast.Node, v reflect.Value) (bool, error) {
+func (d *decoder) tryTextUnmarshaler(node *unstable.Node, v reflect.Value) (bool, error) {
// Special case for time, because we allow to unmarshal to it from
// different kind of AST nodes.
if v.Type() == timeType {
@@ -613,7 +613,7 @@ func (d *decoder) tryTextUnmarshaler(node *ast.Node, v reflect.Value) (bool, err
if v.CanAddr() && v.Addr().Type().Implements(textUnmarshalerType) {
err := v.Addr().Interface().(encoding.TextUnmarshaler).UnmarshalText(node.Data)
if err != nil {
-return false, newDecodeError(d.p.Raw(node.Raw), "%w", err)
+return false, unstable.NewParserError(d.p.Raw(node.Raw), "%w", err)
}
return true, nil
@@ -622,7 +622,7 @@ func (d *decoder) tryTextUnmarshaler(node *ast.Node, v reflect.Value) (bool, err
return false, nil
}
-func (d *decoder) handleValue(value *ast.Node, v reflect.Value) error {
+func (d *decoder) handleValue(value *unstable.Node, v reflect.Value) error {
for v.Kind() == reflect.Ptr {
v = initAndDereferencePointer(v)
}
@@ -633,32 +633,32 @@ func (d *decoder) handleValue(value *ast.Node, v reflect.Value) error {
}
switch value.Kind {
-case ast.String:
+case unstable.String:
return d.unmarshalString(value, v)
-case ast.Integer:
+case unstable.Integer:
return d.unmarshalInteger(value, v)
-case ast.Float:
+case unstable.Float:
return d.unmarshalFloat(value, v)
-case ast.Bool:
+case unstable.Bool:
return d.unmarshalBool(value, v)
-case ast.DateTime:
+case unstable.DateTime:
return d.unmarshalDateTime(value, v)
-case ast.LocalDate:
+case unstable.LocalDate:
return d.unmarshalLocalDate(value, v)
-case ast.LocalTime:
+case unstable.LocalTime:
return d.unmarshalLocalTime(value, v)
-case ast.LocalDateTime:
+case unstable.LocalDateTime:
return d.unmarshalLocalDateTime(value, v)
-case ast.InlineTable:
+case unstable.InlineTable:
return d.unmarshalInlineTable(value, v)
-case ast.Array:
+case unstable.Array:
return d.unmarshalArray(value, v)
default:
panic(fmt.Errorf("handleValue not implemented for %s", value.Kind))
}
}
-func (d *decoder) unmarshalArray(array *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalArray(array *unstable.Node, v reflect.Value) error {
switch v.Kind() {
case reflect.Slice:
if v.IsNil() {
@@ -729,7 +729,7 @@ func (d *decoder) unmarshalArray(array *ast.Node, v reflect.Value) error {
return nil
}
-func (d *decoder) unmarshalInlineTable(itable *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalInlineTable(itable *unstable.Node, v reflect.Value) error {
// Make sure v is an initialized object.
switch v.Kind() {
case reflect.Map:
@@ -746,7 +746,7 @@ func (d *decoder) unmarshalInlineTable(itable *ast.Node, v reflect.Value) error
}
return d.unmarshalInlineTable(itable, elem)
default:
-return newDecodeError(itable.Data, "cannot store inline table in Go type %s", v.Kind())
+return unstable.NewParserError(d.p.Raw(itable.Raw), "cannot store inline table in Go type %s", v.Kind())
}
it := itable.Children()
@@ -765,7 +765,7 @@ func (d *decoder) unmarshalInlineTable(itable *ast.Node, v reflect.Value) error
return nil
}
-func (d *decoder) unmarshalDateTime(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalDateTime(value *unstable.Node, v reflect.Value) error {
dt, err := parseDateTime(value.Data)
if err != nil {
return err
@@ -775,7 +775,7 @@ func (d *decoder) unmarshalDateTime(value *ast.Node, v reflect.Value) error {
return nil
}
-func (d *decoder) unmarshalLocalDate(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalLocalDate(value *unstable.Node, v reflect.Value) error {
ld, err := parseLocalDate(value.Data)
if err != nil {
return err
@@ -792,28 +792,28 @@ func (d *decoder) unmarshalLocalDate(value *ast.Node, v reflect.Value) error {
return nil
}
-func (d *decoder) unmarshalLocalTime(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalLocalTime(value *unstable.Node, v reflect.Value) error {
lt, rest, err := parseLocalTime(value.Data)
if err != nil {
return err
}
if len(rest) > 0 {
-return newDecodeError(rest, "extra characters at the end of a local time")
+return unstable.NewParserError(rest, "extra characters at the end of a local time")
}
v.Set(reflect.ValueOf(lt))
return nil
}
-func (d *decoder) unmarshalLocalDateTime(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalLocalDateTime(value *unstable.Node, v reflect.Value) error {
ldt, rest, err := parseLocalDateTime(value.Data)
if err != nil {
return err
}
if len(rest) > 0 {
-return newDecodeError(rest, "extra characters at the end of a local date time")
+return unstable.NewParserError(rest, "extra characters at the end of a local date time")
}
if v.Type() == timeType {
@@ -828,7 +828,7 @@ func (d *decoder) unmarshalLocalDateTime(value *ast.Node, v reflect.Value) error
return nil
}
-func (d *decoder) unmarshalBool(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalBool(value *unstable.Node, v reflect.Value) error {
b := value.Data[0] == 't'
switch v.Kind() {
@@ -837,13 +837,13 @@ func (d *decoder) unmarshalBool(value *ast.Node, v reflect.Value) error {
case reflect.Interface:
v.Set(reflect.ValueOf(b))
default:
-return newDecodeError(value.Data, "cannot assign boolean to a %t", b)
+return unstable.NewParserError(value.Data, "cannot assign boolean to a %t", b)
}
return nil
}
-func (d *decoder) unmarshalFloat(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalFloat(value *unstable.Node, v reflect.Value) error {
f, err := parseFloat(value.Data)
if err != nil {
return err
@@ -854,23 +854,43 @@ func (d *decoder) unmarshalFloat(value *ast.Node, v reflect.Value) error {
v.SetFloat(f)
case reflect.Float32:
if f > math.MaxFloat32 {
-return newDecodeError(value.Data, "number %f does not fit in a float32", f)
+return unstable.NewParserError(value.Data, "number %f does not fit in a float32", f)
}
v.SetFloat(f)
case reflect.Interface:
v.Set(reflect.ValueOf(f))
default:
-return newDecodeError(value.Data, "float cannot be assigned to %s", v.Kind())
+return unstable.NewParserError(value.Data, "float cannot be assigned to %s", v.Kind())
}
return nil
}
-func (d *decoder) unmarshalInteger(value *ast.Node, v reflect.Value) error {
const (
maxInt = int64(^uint(0) >> 1)
minInt = -maxInt - 1
)
+// Maximum value of uint for decoding. Currently the decoder parses the integer
+// into an int64. As a result, on architectures where uint is 64 bits, the
+// effective maximum uint we can decode is the maximum of int64. On
+// architectures where uint is 32 bits, the maximum value we can decode is
+// lower: the maximum of uint32. I didn't find a way to figure out this value at
+// compile time, so it is computed during initialization.
+var maxUint int64 = math.MaxInt64
+func init() {
+m := uint64(^uint(0))
+if m < uint64(maxUint) {
+maxUint = int64(m)
+}
+}
+func (d *decoder) unmarshalInteger(value *unstable.Node, v reflect.Value) error {
+kind := v.Kind()
+if kind == reflect.Float32 || kind == reflect.Float64 {
+return d.unmarshalFloat(value, v)
+}
i, err := parseInteger(value.Data)
if err != nil {
@@ -879,7 +899,7 @@ func (d *decoder) unmarshalInteger(value *ast.Node, v reflect.Value) error {
var r reflect.Value
-switch v.Kind() {
+switch kind {
case reflect.Int64:
v.SetInt(i)
return nil
@@ -932,7 +952,7 @@ func (d *decoder) unmarshalInteger(value *ast.Node, v reflect.Value) error {
r = reflect.ValueOf(uint8(i))
case reflect.Uint:
-if i < 0 {
+if i < 0 || i > maxUint {
return fmt.Errorf("toml: negative number %d does not fit in an uint", i)
}
@@ -952,20 +972,20 @@ func (d *decoder) unmarshalInteger(value *ast.Node, v reflect.Value) error {
return nil
}
-func (d *decoder) unmarshalString(value *ast.Node, v reflect.Value) error {
+func (d *decoder) unmarshalString(value *unstable.Node, v reflect.Value) error {
switch v.Kind() {
case reflect.String:
v.SetString(string(value.Data))
case reflect.Interface:
v.Set(reflect.ValueOf(string(value.Data)))
default:
-return newDecodeError(d.p.Raw(value.Raw), "cannot store TOML string into a Go %s", v.Kind())
+return unstable.NewParserError(d.p.Raw(value.Raw), "cannot store TOML string into a Go %s", v.Kind())
}
return nil
}
func (d *decoder) handleKeyValue(expr *ast.Node, v reflect.Value) (reflect.Value, error) { func (d *decoder) handleKeyValue(expr *unstable.Node, v reflect.Value) (reflect.Value, error) {
d.strict.EnterKeyValue(expr) d.strict.EnterKeyValue(expr)
v, err := d.handleKeyValueInner(expr.Key(), expr.Value(), v) v, err := d.handleKeyValueInner(expr.Key(), expr.Value(), v)
@@ -979,7 +999,7 @@ func (d *decoder) handleKeyValue(expr *ast.Node, v reflect.Value) (reflect.Value
return v, err return v, err
} }
func (d *decoder) handleKeyValueInner(key ast.Iterator, value *ast.Node, v reflect.Value) (reflect.Value, error) { func (d *decoder) handleKeyValueInner(key unstable.Iterator, value *unstable.Node, v reflect.Value) (reflect.Value, error) {
if key.Next() { if key.Next() {
// Still scoping the key // Still scoping the key
return d.handleKeyValuePart(key, value, v) return d.handleKeyValuePart(key, value, v)
@@ -989,7 +1009,7 @@ func (d *decoder) handleKeyValueInner(key ast.Iterator, value *ast.Node, v refle
 return reflect.Value{}, d.handleValue(value, v)
 }
-func (d *decoder) handleKeyValuePart(key ast.Iterator, value *ast.Node, v reflect.Value) (reflect.Value, error) {
+func (d *decoder) handleKeyValuePart(key unstable.Iterator, value *unstable.Node, v reflect.Value) (reflect.Value, error) {
 // contains the replacement for v
 var rv reflect.Value
@@ -1019,15 +1039,9 @@ func (d *decoder) handleKeyValuePart(key ast.Iterator, value *ast.Node, v reflec
 mv := v.MapIndex(mk)
 set := false
-if !mv.IsValid() {
+if !mv.IsValid() || key.IsLast() {
 set = true
 mv = reflect.New(v.Type().Elem()).Elem()
-} else {
-if key.IsLast() {
-var x interface{}
-mv = reflect.ValueOf(&x).Elem()
-set = true
-}
 }
 nv, err := d.handleKeyValueInner(key, value, mv)
@@ -1056,7 +1070,7 @@ func (d *decoder) handleKeyValuePart(key ast.Iterator, value *ast.Node, v reflec
 d.errorContext.Struct = t
 d.errorContext.Field = path
-f := v.FieldByIndex(path)
+f := fieldByIndex(v, path)
 x, err := d.handleKeyValueInner(key, value, f)
 if err != nil {
 return reflect.Value{}, err
@@ -1120,6 +1134,21 @@ func initAndDereferencePointer(v reflect.Value) reflect.Value {
 return elem
 }
+// Same as reflect.Value.FieldByIndex, but creates pointers if needed.
+func fieldByIndex(v reflect.Value, path []int) reflect.Value {
+for i, x := range path {
+v = v.Field(x)
+if i < len(path)-1 && v.Kind() == reflect.Ptr {
+if v.IsNil() {
+v.Set(reflect.New(v.Type().Elem()))
+}
+v = v.Elem()
+}
+}
+return v
+}
 type fieldPathsMap = map[string][]int
 var globalFieldPathsCache atomic.Value // map[danger.TypeID]fieldPathsMap
@@ -1167,11 +1196,6 @@ func forEachField(t reflect.Type, path []int, do func(name string, path []int))
 fieldPath := append(path, i)
 fieldPath = fieldPath[:len(fieldPath):len(fieldPath)]
-if f.Anonymous {
-forEachField(f.Type, fieldPath, do)
-continue
-}
 name := f.Tag.Get("toml")
 if name == "-" {
 continue
@@ -1180,6 +1204,19 @@ func forEachField(t reflect.Type, path []int, do func(name string, path []int))
 if i := strings.IndexByte(name, ','); i >= 0 {
 name = name[:i]
 }
+if f.Anonymous && name == "" {
+t2 := f.Type
+if t2.Kind() == reflect.Ptr {
+t2 = t2.Elem()
+}
+if t2.Kind() == reflect.Struct {
+forEachField(t2, fieldPath, do)
+}
+continue
+}
 if name == "" {
 name = f.Name
 }
@@ -1735,6 +1735,28 @@ B = "data"`,
 }
 },
 },
+{
+desc: "kv that points to a slice",
+input: "a.b.c = 'foo'",
+gen: func() test {
+doc := map[string][]string{}
+return test{
+target: &doc,
+err: true,
+}
+},
+},
+{
+desc: "kv that points to a pointer to a slice",
+input: "a.b.c = 'foo'",
+gen: func() test {
+doc := map[string]*[]string{}
+return test{
+target: &doc,
+err: true,
+}
+},
+},
 }
 for _, e := range examples {
@@ -1876,8 +1898,7 @@ key2 = "missing2"
 key3 = "missing3"
 key4 = "value4"
 `,
-expected: `
-2| key1 = "value1"
+expected: `2| key1 = "value1"
 3| key2 = "missing2"
  | ~~~~ missing field
 4| key3 = "missing3"
@@ -1887,8 +1908,7 @@ key4 = "value4"
 3| key2 = "missing2"
 4| key3 = "missing3"
  | ~~~~ missing field
-5| key4 = "value4"
-`,
+5| key4 = "value4"`,
 target: &struct {
 Key1 string
 Key4 string
@@ -1897,10 +1917,8 @@ key4 = "value4"
 {
 desc: "multi-part key",
 input: `a.short.key="foo"`,
-expected: `
-1| a.short.key="foo"
- | ~~~~~~~~~~~ missing field
-`,
+expected: `1| a.short.key="foo"
+ | ~~~~~~~~~~~ missing field`,
 },
 {
 desc: "missing table",
@@ -1908,24 +1926,19 @@ key4 = "value4"
 [foo]
 bar = 42
 `,
-expected: `
-2| [foo]
+expected: `2| [foo]
  | ~~~ missing table
-3| bar = 42
-`,
+3| bar = 42`,
 },
 {
 desc: "missing array table",
 input: `
 [[foo]]
-bar = 42
-`,
-expected: `
-2| [[foo]]
+bar = 42`,
+expected: `2| [[foo]]
  | ~~~ missing table
-3| bar = 42
-`,
+3| bar = 42`,
 },
 }
@@ -1944,7 +1957,7 @@ bar = 42
 var tsm *toml.StrictMissingError
 if errors.As(err, &tsm) {
-equalStringsIgnoreNewlines(t, e.expected, tsm.String())
+assert.Equal(t, e.expected, tsm.String())
 } else {
 t.Fatalf("err should have been a *toml.StrictMissingError, but got %s (%T)", err, err)
 }
@@ -2380,6 +2393,100 @@ func TestIssue714(t *testing.T) {
 require.Error(t, err)
 }
+func TestIssue772(t *testing.T) {
+type FileHandling struct {
+FilePattern string `toml:"pattern"`
+}
+type Config struct {
+FileHandling `toml:"filehandling"`
+}
+var defaultConfigFile = []byte(`
+[filehandling]
+pattern = "reach-masterdev-"`)
+config := Config{}
+err := toml.Unmarshal(defaultConfigFile, &config)
+require.NoError(t, err)
+require.Equal(t, "reach-masterdev-", config.FileHandling.FilePattern)
+}
+func TestIssue774(t *testing.T) {
+type ScpData struct {
+Host string `json:"host"`
+}
+type GenConfig struct {
+SCP []ScpData `toml:"scp" comment:"Array of Secure Copy Configurations"`
+}
+c := &GenConfig{}
+c.SCP = []ScpData{{Host: "main.domain.com"}}
+b, err := toml.Marshal(c)
+require.NoError(t, err)
+expected := `# Array of Secure Copy Configurations
+[[scp]]
+Host = 'main.domain.com'
+`
+require.Equal(t, expected, string(b))
+}
+func TestIssue799(t *testing.T) {
+const testTOML = `
+# notice the double brackets
+[[test]]
+answer = 42
+`
+var s struct {
+// should be []map[string]int
+Test map[string]int `toml:"test"`
+}
+err := toml.Unmarshal([]byte(testTOML), &s)
+require.Error(t, err)
+}
+func TestIssue807(t *testing.T) {
+type A struct {
+Name string `toml:"name"`
+}
+type M struct {
+*A
+}
+var m M
+err := toml.Unmarshal([]byte(`name = 'foo'`), &m)
+require.NoError(t, err)
+require.Equal(t, "foo", m.Name)
+}
+func TestIssue850(t *testing.T) {
+data := make(map[string]string)
+err := toml.Unmarshal([]byte("foo = {}"), &data)
+require.Error(t, err)
+}
+func TestIssue851(t *testing.T) {
+type Target struct {
+Params map[string]string `toml:"params"`
+}
+content := "params = {a=\"1\",b=\"2\"}"
+var target Target
+err := toml.Unmarshal([]byte(content), &target)
+require.NoError(t, err)
+require.Equal(t, map[string]string{"a": "1", "b": "2"}, target.Params)
+err = toml.Unmarshal([]byte(content), &target)
+require.NoError(t, err)
+require.Equal(t, map[string]string{"a": "1", "b": "2"}, target.Params)
+}
 func TestUnmarshalDecodeErrors(t *testing.T) {
 examples := []struct {
 desc string
@@ -2656,7 +2763,7 @@ world'`,
 data: "a = \"aaaa\xE2\x80\x00\"",
 },
 {
-desc: "invalid 4rd byte of 4-byte utf8 character in string with no escape sequence",
+desc: "invalid 4th byte of 4-byte utf8 character in string with no escape sequence",
 data: "a = \"aaaa\xF2\x81\x81\x00\"",
 },
 {
@@ -2672,7 +2779,7 @@ world'`,
 data: "a = 'aaaa\xE2\x80\x00'",
 },
 {
-desc: "invalid 4rd byte of 4-byte utf8 character in literal string",
+desc: "invalid 4th byte of 4-byte utf8 character in literal string",
 data: "a = 'aaaa\xF2\x81\x81\x00'",
 },
 {
@@ -2831,6 +2938,36 @@ world'`,
 }
 }
+func TestOmitEmpty(t *testing.T) {
+type inner struct {
+private string
+Skip string `toml:"-"`
+V string
+}
+type elem struct {
+Foo string `toml:",omitempty"`
+Bar string `toml:",omitempty"`
+Inner inner `toml:",omitempty"`
+}
+type doc struct {
+X []elem `toml:",inline"`
+}
+d := doc{X: []elem{elem{
+Foo: "test",
+Inner: inner{
+V: "alue",
+},
+}}}
+b, err := toml.Marshal(d)
+require.NoError(t, err)
+require.Equal(t, "X = [{Foo = 'test', Inner = {V = 'alue'}}]\n", string(b))
+}
 func TestUnmarshalTags(t *testing.T) {
 type doc struct {
 Dash string `toml:"-,"`
@@ -3172,3 +3309,16 @@ func TestUnmarshal_RecursiveTableArray(t *testing.T) {
 })
 }
 }
+func TestUnmarshalEmbedNonString(t *testing.T) {
+type Foo []byte
+type doc struct {
+Foo
+}
+d := doc{}
+err := toml.Unmarshal([]byte(`foo = 'bar'`), &d)
+require.NoError(t, err)
+require.Nil(t, d.Foo)
+}
@@ -1,4 +1,4 @@
-package ast
+package unstable
 import (
 "fmt"
@@ -7,14 +7,17 @@ import (
 "github.com/pelletier/go-toml/v2/internal/danger"
 )
-// Iterator starts uninitialized, you need to call Next() first.
+// Iterator over a sequence of nodes.
+//
+// Starts uninitialized, you need to call Next() first.
 //
 // For example:
 //
 // it := n.Children()
 // for it.Next() {
-// it.Node()
-// }
+// n := it.Node()
+// // do something with n
+// }
 type Iterator struct {
 started bool
 node *Node
@@ -32,42 +35,31 @@ func (c *Iterator) Next() bool {
 }
 // IsLast returns true if the current node of the iterator is the last
-// one. Subsequent call to Next() will return false.
+// one. Subsequent calls to Next() will return false.
 func (c *Iterator) IsLast() bool {
 return c.node.next == 0
 }
-// Node returns a copy of the node pointed at by the iterator.
+// Node returns a pointer to the node pointed at by the iterator.
 func (c *Iterator) Node() *Node {
 return c.node
 }
-// Root contains a full AST.
-//
-// It is immutable once constructed with Builder.
-type Root struct {
-nodes []Node
-}
-// Iterator over the top level nodes.
-func (r *Root) Iterator() Iterator {
-it := Iterator{}
-if len(r.nodes) > 0 {
-it.node = &r.nodes[0]
-}
-return it
-}
-func (r *Root) at(idx Reference) *Node {
-return &r.nodes[idx]
-}
-// Arrays have one child per element in the array. InlineTables have
-// one child per key-value pair in the table. KeyValues have at least
-// two children. The first one is the value. The rest make a
-// potentially dotted key. Table and Array table have one child per
-// element of the key they represent (same as KeyValue, but without
-// the last node being the value).
+// Node in a TOML expression AST.
+//
+// Depending on Kind, its sequence of children should be interpreted
+// differently.
+//
+// - Array have one child per element in the array.
+// - InlineTable have one child per key-value in the table (each of kind
+//   InlineTable).
+// - KeyValue have at least two children. The first one is the value. The rest
+//   make a potentially dotted key.
+// - Table and ArrayTable's children represent a dotted key (same as
+//   KeyValue, but without the first node being the value).
+//
+// When relevant, Raw describes the range of bytes this node is referring to in
+// the input document. Use Parser.Raw() to retrieve the actual bytes.
 type Node struct {
 Kind Kind
 Raw Range // Raw bytes from the input.
@@ -80,13 +72,13 @@ type Node struct {
 child int // 0 if no child
 }
+// Range of bytes in the document.
 type Range struct {
 Offset uint32
 Length uint32
 }
-// Next returns a copy of the next node, or an invalid Node if there
-// is no next node.
+// Next returns a pointer to the next node, or nil if there is no next node.
 func (n *Node) Next() *Node {
 if n.next == 0 {
 return nil
@@ -96,9 +88,9 @@ func (n *Node) Next() *Node {
 return (*Node)(danger.Stride(ptr, size, n.next))
 }
-// Child returns a copy of the first child node of this node. Other
-// children can be accessed calling Next on the first child. Returns
-// an invalid Node if there is none.
+// Child returns a pointer to the first child node of this node. Other children
+// can be accessed calling Next on the first child. Returns an nil if this Node
+// has no child.
 func (n *Node) Child() *Node {
 if n.child == 0 {
 return nil
@@ -113,9 +105,9 @@ func (n *Node) Valid() bool {
 return n != nil
 }
-// Key returns the child nodes making the Key on a supported
-// node. Panics otherwise. They are guaranteed to be all be of the
-// Kind Key. A simple key would return just one element.
+// Key returns the children nodes making the Key on a supported node. Panics
+// otherwise. They are guaranteed to be all be of the Kind Key. A simple key
+// would return just one element.
 func (n *Node) Key() Iterator {
 switch n.Kind {
 case KeyValue:
@@ -1,4 +1,4 @@
-package toml
+package unstable
 import (
 "bytes"
for name, input := range inputs { for name, input := range inputs {
b.Run(name, func(b *testing.B) { b.Run(name, func(b *testing.B) {
p := parser{} p := Parser{}
b.SetBytes(int64(len(input))) b.SetBytes(int64(len(input)))
b.ReportAllocs() b.ReportAllocs()
b.ResetTimer() b.ResetTimer()
@@ -0,0 +1,71 @@
+package unstable
+// root contains a full AST.
+//
+// It is immutable once constructed with Builder.
+type root struct {
+nodes []Node
+}
+// Iterator over the top level nodes.
+func (r *root) Iterator() Iterator {
+it := Iterator{}
+if len(r.nodes) > 0 {
+it.node = &r.nodes[0]
+}
+return it
+}
+func (r *root) at(idx reference) *Node {
+return &r.nodes[idx]
+}
+type reference int
+const invalidReference reference = -1
+func (r reference) Valid() bool {
+return r != invalidReference
+}
+type builder struct {
+tree root
+lastIdx int
+}
+func (b *builder) Tree() *root {
+return &b.tree
+}
+func (b *builder) NodeAt(ref reference) *Node {
+return b.tree.at(ref)
+}
+func (b *builder) Reset() {
+b.tree.nodes = b.tree.nodes[:0]
+b.lastIdx = 0
+}
+func (b *builder) Push(n Node) reference {
+b.lastIdx = len(b.tree.nodes)
+b.tree.nodes = append(b.tree.nodes, n)
+return reference(b.lastIdx)
+}
+func (b *builder) PushAndChain(n Node) reference {
+newIdx := len(b.tree.nodes)
+b.tree.nodes = append(b.tree.nodes, n)
+if b.lastIdx >= 0 {
+b.tree.nodes[b.lastIdx].next = newIdx - b.lastIdx
+}
+b.lastIdx = newIdx
+return reference(b.lastIdx)
+}
+func (b *builder) AttachChild(parent reference, child reference) {
+b.tree.nodes[parent].child = int(child) - int(parent)
+}
+func (b *builder) Chain(from reference, to reference) {
+b.tree.nodes[from].next = int(to) - int(from)
+}
@@ -0,0 +1,3 @@
+// Package unstable provides APIs that do not meet the backward compatibility
+// guarantees yet.
+package unstable
@@ -1,25 +1,26 @@
-package ast
+package unstable
 import "fmt"
+// Kind represents the type of TOML structure contained in a given Node.
 type Kind int
 const (
-// meta
+// Meta
 Invalid Kind = iota
 Comment
 Key
-// top level structures
+// Top level structures
 Table
 ArrayTable
 KeyValue
-// containers values
+// Containers values
 Array
 InlineTable
-// values
+// Values
 String
 Bool
 Float
@@ -30,6 +31,7 @@ const (
 DateTime
 )
+// String implementation of fmt.Stringer.
 func (k Kind) String() string {
 switch k {
 case Invalid:
@@ -1,50 +1,108 @@
-package toml
+package unstable
 import (
 "bytes"
+"fmt"
 "unicode"
-"github.com/pelletier/go-toml/v2/internal/ast"
+"github.com/pelletier/go-toml/v2/internal/characters"
 "github.com/pelletier/go-toml/v2/internal/danger"
 )
-type parser struct {
-builder ast.Builder
-ref ast.Reference
+// ParserError describes an error relative to the content of the document.
+//
+// It cannot outlive the instance of Parser it refers to, and may cause panics
+// if the parser is reset.
+type ParserError struct {
+Highlight []byte
+Message string
+Key []string // optional
+}
+// Error is the implementation of the error interface.
+func (e *ParserError) Error() string {
+return e.Message
+}
+// NewParserError is a convenience function to create a ParserError
+//
+// Warning: Highlight needs to be a subslice of Parser.data, so only slices
+// returned by Parser.Raw are valid candidates.
+func NewParserError(highlight []byte, format string, args ...interface{}) error {
+return &ParserError{
+Highlight: highlight,
+Message: fmt.Errorf(format, args...).Error(),
+}
+}
+// Parser scans over a TOML-encoded document and generates an iterative AST.
+//
+// To prime the Parser, first reset it with the contents of a TOML document.
+// Then, process all top-level expressions sequentially. See Example.
+//
+// Don't forget to check Error() after you're done parsing.
+//
+// Each top-level expression needs to be fully processed before calling
+// NextExpression() again. Otherwise, calls to various Node methods may panic if
+// the parser has moved on the next expression.
+//
+// For performance reasons, go-toml doesn't make a copy of the input bytes to
+// the parser. Make sure to copy all the bytes you need to outlive the slice
+// given to the parser.
+//
+// The parser doesn't provide nodes for comments yet, nor for whitespace.
+type Parser struct {
 data []byte
+builder builder
+ref reference
 left []byte
 err error
 first bool
 }
-func (p *parser) Range(b []byte) ast.Range {
-return ast.Range{
+// Data returns the slice provided to the last call to Reset.
+func (p *Parser) Data() []byte {
+return p.data
+}
+// Range returns a range description that corresponds to a given slice of the
+// input. If the argument is not a subslice of the parser input, this function
+// panics.
+func (p *Parser) Range(b []byte) Range {
+return Range{
 Offset: uint32(danger.SubsliceOffset(p.data, b)),
 Length: uint32(len(b)),
 }
 }
-func (p *parser) Raw(raw ast.Range) []byte {
+// Raw returns the slice corresponding to the bytes in the given range.
+func (p *Parser) Raw(raw Range) []byte {
 return p.data[raw.Offset : raw.Offset+raw.Length]
 }
-func (p *parser) Reset(b []byte) {
+// Reset brings the parser to its initial state for a given input. It wipes an
+// reuses internal storage to reduce allocation.
+func (p *Parser) Reset(b []byte) {
 p.builder.Reset()
-p.ref = ast.InvalidReference
+p.ref = invalidReference
 p.data = b
 p.left = b
 p.err = nil
 p.first = true
 }
-//nolint:cyclop
-func (p *parser) NextExpression() bool {
+// NextExpression parses the next top-level expression. If an expression was
+// successfully parsed, it returns true. If the parser is at the end of the
+// document or an error occurred, it returns false.
+//
+// Retrieve the parsed expression with Expression().
+func (p *Parser) NextExpression() bool {
 if len(p.left) == 0 || p.err != nil {
 return false
 }
 p.builder.Reset()
-p.ref = ast.InvalidReference
+p.ref = invalidReference
 for {
 if len(p.left) == 0 || p.err != nil {
@@ -73,15 +131,18 @@ func (p *parser) NextExpression() bool {
 }
 }
-func (p *parser) Expression() *ast.Node {
+// Expression returns a pointer to the node representing the last successfully
+// parsed expression.
+func (p *Parser) Expression() *Node {
 return p.builder.NodeAt(p.ref)
 }
-func (p *parser) Error() error {
+// Error returns any error that has occurred during parsing.
+func (p *Parser) Error() error {
 return p.err
 }
-func (p *parser) parseNewline(b []byte) ([]byte, error) {
+func (p *Parser) parseNewline(b []byte) ([]byte, error) {
 if b[0] == '\n' {
 return b[1:], nil
 }
@@ -91,14 +152,14 @@ func (p *parser) parseNewline(b []byte) ([]byte, error) {
 return rest, err
 }
-return nil, newDecodeError(b[0:1], "expected newline but got %#U", b[0])
+return nil, NewParserError(b[0:1], "expected newline but got %#U", b[0])
 }
-func (p *parser) parseExpression(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseExpression(b []byte) (reference, []byte, error) {
 // expression = ws [ comment ]
 // expression =/ ws keyval ws [ comment ]
 // expression =/ ws table ws [ comment ]
-ref := ast.InvalidReference
+ref := invalidReference
 b = p.parseWhitespace(b)
@@ -136,7 +197,7 @@ func (p *parser) parseExpression(b []byte) (ast.Reference, []byte, error) {
 return ref, b, nil
 }
-func (p *parser) parseTable(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseTable(b []byte) (reference, []byte, error) {
 // table = std-table / array-table
 if len(b) > 1 && b[1] == '[' {
 return p.parseArrayTable(b)
@@ -145,12 +206,12 @@ func (p *parser) parseTable(b []byte) (ast.Reference, []byte, error) {
 return p.parseStdTable(b)
 }
-func (p *parser) parseArrayTable(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseArrayTable(b []byte) (reference, []byte, error) {
 // array-table = array-table-open key array-table-close
 // array-table-open = %x5B.5B ws ; [[ Double left square bracket
 // array-table-close = ws %x5D.5D ; ]] Double right square bracket
-ref := p.builder.Push(ast.Node{
-Kind: ast.ArrayTable,
+ref := p.builder.Push(Node{
+Kind: ArrayTable,
 })
 b = b[2:]
@@ -174,12 +235,12 @@ func (p *parser) parseArrayTable(b []byte) (ast.Reference, []byte, error) {
 return ref, b, err
 }
-func (p *parser) parseStdTable(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseStdTable(b []byte) (reference, []byte, error) {
 // std-table = std-table-open key std-table-close
 // std-table-open = %x5B ws ; [ Left square bracket
 // std-table-close = ws %x5D ; ] Right square bracket
-ref := p.builder.Push(ast.Node{
-Kind: ast.Table,
+ref := p.builder.Push(Node{
+Kind: Table,
 })
 b = b[1:]
@@ -199,15 +260,15 @@ func (p *parser) parseStdTable(b []byte) (ast.Reference, []byte, error) {
 return ref, b, err
 }
-func (p *parser) parseKeyval(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseKeyval(b []byte) (reference, []byte, error) {
 // keyval = key keyval-sep val
-ref := p.builder.Push(ast.Node{
-Kind: ast.KeyValue,
+ref := p.builder.Push(Node{
+Kind: KeyValue,
 })
 key, b, err := p.parseKey(b)
 if err != nil {
-return ast.InvalidReference, nil, err
+return invalidReference, nil, err
 }
 // keyval-sep = ws %x3D ws ; =
@@ -215,12 +276,12 @@ func (p *parser) parseKeyval(b []byte) (ast.Reference, []byte, error) {
 b = p.parseWhitespace(b)
 if len(b) == 0 {
-return ast.InvalidReference, nil, newDecodeError(b, "expected = after a key, but the document ends there")
+return invalidReference, nil, NewParserError(b, "expected = after a key, but the document ends there")
 }
 b, err = expect('=', b)
 if err != nil {
-return ast.InvalidReference, nil, err
+return invalidReference, nil, err
 }
 b = p.parseWhitespace(b)
@@ -237,12 +298,12 @@ func (p *parser) parseKeyval(b []byte) (ast.Reference, []byte, error) {
 }
 //nolint:cyclop,funlen
-func (p *parser) parseVal(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseVal(b []byte) (reference, []byte, error) {
 // val = string / boolean / array / inline-table / date-time / float / integer
-ref := ast.InvalidReference
+ref := invalidReference
 if len(b) == 0 {
-return ref, nil, newDecodeError(b, "expected value, not eof")
+return ref, nil, NewParserError(b, "expected value, not eof")
 }
 var err error
@@ -259,8 +320,8 @@ func (p *parser) parseVal(b []byte) (ast.Reference, []byte, error) {
 }
 if err == nil {
-ref = p.builder.Push(ast.Node{
-Kind: ast.String,
+ref = p.builder.Push(Node{
+Kind: String,
 Raw: p.Range(raw),
 Data: v,
 })
@@ -277,8 +338,8 @@ func (p *parser) parseVal(b []byte) (ast.Reference, []byte, error) {
 }
 if err == nil {
-ref = p.builder.Push(ast.Node{
-Kind: ast.String,
+ref = p.builder.Push(Node{
+Kind: String,
 Raw: p.Range(raw),
 Data: v,
 })
@@ -287,22 +348,22 @@ func (p *parser) parseVal(b []byte) (ast.Reference, []byte, error) {
 return ref, b, err
 case 't':
 if !scanFollowsTrue(b) {
-return ref, nil, newDecodeError(atmost(b, 4), "expected 'true'")
+return ref, nil, NewParserError(atmost(b, 4), "expected 'true'")
 }
-ref = p.builder.Push(ast.Node{
-Kind: ast.Bool,
+ref = p.builder.Push(Node{
+Kind: Bool,
 Data: b[:4],
 })
 return ref, b[4:], nil
 case 'f':
 if !scanFollowsFalse(b) {
-return ref, nil, newDecodeError(atmost(b, 5), "expected 'false'")
+return ref, nil, NewParserError(atmost(b, 5), "expected 'false'")
 }
-ref = p.builder.Push(ast.Node{
-Kind: ast.Bool,
+ref = p.builder.Push(Node{
+Kind: Bool,
 Data: b[:5],
 })
@@ -324,7 +385,7 @@ func atmost(b []byte, n int) []byte {
 return b[:n]
 }
-func (p *parser) parseLiteralString(b []byte) ([]byte, []byte, []byte, error) {
+func (p *Parser) parseLiteralString(b []byte) ([]byte, []byte, []byte, error) {
 v, rest, err := scanLiteralString(b)
 if err != nil {
 return nil, nil, nil, err
@@ -333,19 +394,20 @@ func (p *parser) parseLiteralString(b []byte) ([]byte, []byte, []byte, error) {
 return v, v[1 : len(v)-1], rest, nil
 }
-func (p *parser) parseInlineTable(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseInlineTable(b []byte) (reference, []byte, error) {
 // inline-table = inline-table-open [ inline-table-keyvals ] inline-table-close
 // inline-table-open = %x7B ws ; {
 // inline-table-close = ws %x7D ; }
 // inline-table-sep = ws %x2C ws ; , Comma
 // inline-table-keyvals = keyval [ inline-table-sep inline-table-keyvals ]
-parent := p.builder.Push(ast.Node{
-Kind: ast.InlineTable,
+parent := p.builder.Push(Node{
+Kind: InlineTable,
+Raw: p.Range(b[:1]),
 })
 first := true
-var child ast.Reference
+var child reference
 b = b[1:]
@@ -356,7 +418,7 @@ func (p *parser) parseInlineTable(b []byte) (ast.Reference, []byte, error) {
 b = p.parseWhitespace(b)
 if len(b) == 0 {
-return parent, nil, newDecodeError(previousB[:1], "inline table is incomplete")
+return parent, nil, NewParserError(previousB[:1], "inline table is incomplete")
 }
 if b[0] == '}' {
@@ -371,7 +433,7 @@ func (p *parser) parseInlineTable(b []byte) (ast.Reference, []byte, error) {
 b = p.parseWhitespace(b)
 }
-var kv ast.Reference
+var kv reference
 kv, b, err = p.parseKeyval(b)
 if err != nil {
@@ -394,7 +456,7 @@ func (p *parser) parseInlineTable(b []byte) (ast.Reference, []byte, error) {
 }
 //nolint:funlen,cyclop
-func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
+func (p *Parser) parseValArray(b []byte) (reference, []byte, error) {
 // array = array-open [ array-values ] ws-comment-newline array-close
 // array-open = %x5B ; [
 // array-close = %x5D ; ]
@@ -405,13 +467,13 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 arrayStart := b
 b = b[1:]
-parent := p.builder.Push(ast.Node{
-Kind: ast.Array,
+parent := p.builder.Push(Node{
+Kind: Array,
 })
 first := true
-var lastChild ast.Reference
+var lastChild reference
 var err error
 for len(b) > 0 {
@@ -421,7 +483,7 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 }
 if len(b) == 0 {
-return parent, nil, newDecodeError(arrayStart[:1], "array is incomplete")
+return parent, nil, NewParserError(arrayStart[:1], "array is incomplete")
 }
 if b[0] == ']' {
@@ -430,7 +492,7 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 if b[0] == ',' {
 if first {
-return parent, nil, newDecodeError(b[0:1], "array cannot start with comma")
+return parent, nil, NewParserError(b[0:1], "array cannot start with comma")
 }
 b = b[1:]
@@ -439,7 +501,7 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 return parent, nil, err
 }
 } else if !first {
-return parent, nil, newDecodeError(b[0:1], "array elements must be separated by commas")
+return parent, nil, NewParserError(b[0:1], "array elements must be separated by commas")
 }
 // TOML allows trailing commas in arrays.
@@ -447,7 +509,7 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 break
 }
-var valueRef ast.Reference
+var valueRef reference
 valueRef, b, err = p.parseVal(b)
 if err != nil {
 return parent, nil, err
@@ -472,7 +534,7 @@ func (p *parser) parseValArray(b []byte) (ast.Reference, []byte, error) {
 return parent, rest, err
 }
-func (p *parser) parseOptionalWhitespaceCommentNewline(b []byte) ([]byte, error) {
+func (p *Parser) parseOptionalWhitespaceCommentNewline(b []byte) ([]byte, error) {
 for len(b) > 0 {
 var err error
 b = p.parseWhitespace(b)
@@ -501,7 +563,7 @@ func (p *parser) parseOptionalWhitespaceCommentNewline(b []byte) ([]byte, error)
 return b, nil
 }
-func (p *parser) parseMultilineLiteralString(b []byte) ([]byte, []byte, []byte, error) {
+func (p *Parser) parseMultilineLiteralString(b []byte) ([]byte, []byte, []byte, error) {
 token, rest, err := scanMultilineLiteralString(b)
 if err != nil {
 return nil, nil, nil, err
@@ -520,7 +582,7 @@ func (p *parser) parseMultilineLiteralString(b []byte) ([]byte, []byte, []byte,
 }
 //nolint:funlen,gocognit,cyclop
-func (p *parser) parseMultilineBasicString(b []byte) ([]byte, []byte, []byte, error) {
+func (p *Parser) parseMultilineBasicString(b []byte) ([]byte, []byte, []byte, error) {
 // ml-basic-string = ml-basic-string-delim [ newline ] ml-basic-body
 // ml-basic-string-delim
 // ml-basic-string-delim = 3quotation-mark
@@ -551,11 +613,11 @@ func (p *parser) parseMultilineBasicString(b []byte) ([]byte, []byte, []byte, er
if !escaped { if !escaped {
str := token[startIdx:endIdx] str := token[startIdx:endIdx]
verr := utf8TomlValidAlreadyEscaped(str) verr := characters.Utf8TomlValidAlreadyEscaped(str)
if verr.Zero() { if verr.Zero() {
return token, str, rest, nil return token, str, rest, nil
} }
return nil, nil, nil, newDecodeError(str[verr.Index:verr.Index+verr.Size], "invalid UTF-8") return nil, nil, nil, NewParserError(str[verr.Index:verr.Index+verr.Size], "invalid UTF-8")
} }
var builder bytes.Buffer var builder bytes.Buffer
@@ -635,13 +697,13 @@ func (p *parser) parseMultilineBasicString(b []byte) ([]byte, []byte, []byte, er
builder.WriteRune(x) builder.WriteRune(x)
i += 8 i += 8
default: default:
return nil, nil, nil, newDecodeError(token[i:i+1], "invalid escaped character %#U", c) return nil, nil, nil, NewParserError(token[i:i+1], "invalid escaped character %#U", c)
} }
i++ i++
} else { } else {
size := utf8ValidNext(token[i:]) size := characters.Utf8ValidNext(token[i:])
if size == 0 { if size == 0 {
return nil, nil, nil, newDecodeError(token[i:i+1], "invalid character %#U", c) return nil, nil, nil, NewParserError(token[i:i+1], "invalid character %#U", c)
} }
builder.Write(token[i : i+size]) builder.Write(token[i : i+size])
i += size i += size
@@ -651,7 +713,7 @@ func (p *parser) parseMultilineBasicString(b []byte) ([]byte, []byte, []byte, er
return token, builder.Bytes(), rest, nil return token, builder.Bytes(), rest, nil
} }
func (p *parser) parseKey(b []byte) (ast.Reference, []byte, error) { func (p *Parser) parseKey(b []byte) (reference, []byte, error) {
// key = simple-key / dotted-key // key = simple-key / dotted-key
// simple-key = quoted-key / unquoted-key // simple-key = quoted-key / unquoted-key
// //
@@ -662,11 +724,11 @@ func (p *parser) parseKey(b []byte) (ast.Reference, []byte, error) {
// dot-sep = ws %x2E ws ; . Period // dot-sep = ws %x2E ws ; . Period
raw, key, b, err := p.parseSimpleKey(b) raw, key, b, err := p.parseSimpleKey(b)
if err != nil { if err != nil {
return ast.InvalidReference, nil, err return invalidReference, nil, err
} }
ref := p.builder.Push(ast.Node{ ref := p.builder.Push(Node{
Kind: ast.Key, Kind: Key,
Raw: p.Range(raw), Raw: p.Range(raw),
Data: key, Data: key,
}) })
@@ -681,8 +743,8 @@ func (p *parser) parseKey(b []byte) (ast.Reference, []byte, error) {
return ref, nil, err return ref, nil, err
} }
p.builder.PushAndChain(ast.Node{ p.builder.PushAndChain(Node{
Kind: ast.Key, Kind: Key,
Raw: p.Range(raw), Raw: p.Range(raw),
Data: key, Data: key,
}) })
@@ -694,9 +756,9 @@ func (p *parser) parseKey(b []byte) (ast.Reference, []byte, error) {
return ref, b, nil return ref, b, nil
} }
func (p *parser) parseSimpleKey(b []byte) (raw, key, rest []byte, err error) { func (p *Parser) parseSimpleKey(b []byte) (raw, key, rest []byte, err error) {
if len(b) == 0 { if len(b) == 0 {
return nil, nil, nil, newDecodeError(b, "expected key but found none") return nil, nil, nil, NewParserError(b, "expected key but found none")
} }
// simple-key = quoted-key / unquoted-key // simple-key = quoted-key / unquoted-key
@@ -711,12 +773,12 @@ func (p *parser) parseSimpleKey(b []byte) (raw, key, rest []byte, err error) {
key, rest = scanUnquotedKey(b) key, rest = scanUnquotedKey(b)
return key, key, rest, nil return key, key, rest, nil
default: default:
return nil, nil, nil, newDecodeError(b[0:1], "invalid character at start of key: %c", b[0]) return nil, nil, nil, NewParserError(b[0:1], "invalid character at start of key: %c", b[0])
} }
} }
//nolint:funlen,cyclop //nolint:funlen,cyclop
func (p *parser) parseBasicString(b []byte) ([]byte, []byte, []byte, error) { func (p *Parser) parseBasicString(b []byte) ([]byte, []byte, []byte, error) {
// basic-string = quotation-mark *basic-char quotation-mark // basic-string = quotation-mark *basic-char quotation-mark
// quotation-mark = %x22 ; " // quotation-mark = %x22 ; "
// basic-char = basic-unescaped / escaped // basic-char = basic-unescaped / escaped
@@ -744,11 +806,11 @@ func (p *parser) parseBasicString(b []byte) ([]byte, []byte, []byte, error) {
// validate the string and return a direct reference to the buffer. // validate the string and return a direct reference to the buffer.
if !escaped { if !escaped {
str := token[startIdx:endIdx] str := token[startIdx:endIdx]
verr := utf8TomlValidAlreadyEscaped(str) verr := characters.Utf8TomlValidAlreadyEscaped(str)
if verr.Zero() { if verr.Zero() {
return token, str, rest, nil return token, str, rest, nil
} }
return nil, nil, nil, newDecodeError(str[verr.Index:verr.Index+verr.Size], "invalid UTF-8") return nil, nil, nil, NewParserError(str[verr.Index:verr.Index+verr.Size], "invalid UTF-8")
} }
i := startIdx i := startIdx
@@ -795,13 +857,13 @@ func (p *parser) parseBasicString(b []byte) ([]byte, []byte, []byte, error) {
builder.WriteRune(x) builder.WriteRune(x)
i += 8 i += 8
default: default:
return nil, nil, nil, newDecodeError(token[i:i+1], "invalid escaped character %#U", c) return nil, nil, nil, NewParserError(token[i:i+1], "invalid escaped character %#U", c)
} }
i++ i++
} else { } else {
size := utf8ValidNext(token[i:]) size := characters.Utf8ValidNext(token[i:])
if size == 0 { if size == 0 {
return nil, nil, nil, newDecodeError(token[i:i+1], "invalid character %#U", c) return nil, nil, nil, NewParserError(token[i:i+1], "invalid character %#U", c)
} }
builder.Write(token[i : i+size]) builder.Write(token[i : i+size])
i += size i += size
@@ -813,7 +875,7 @@ func (p *parser) parseBasicString(b []byte) ([]byte, []byte, []byte, error) {
func hexToRune(b []byte, length int) (rune, error) { func hexToRune(b []byte, length int) (rune, error) {
if len(b) < length { if len(b) < length {
return -1, newDecodeError(b, "unicode point needs %d characters, not %d", length, len(b)) return -1, NewParserError(b, "unicode point needs %d characters, not %d", length, len(b))
} }
b = b[:length] b = b[:length]
@@ -828,19 +890,19 @@ func hexToRune(b []byte, length int) (rune, error) {
case 'A' <= c && c <= 'F': case 'A' <= c && c <= 'F':
d = uint32(c - 'A' + 10) d = uint32(c - 'A' + 10)
default: default:
return -1, newDecodeError(b[i:i+1], "non-hex character") return -1, NewParserError(b[i:i+1], "non-hex character")
} }
r = r*16 + d r = r*16 + d
} }
if r > unicode.MaxRune || 0xD800 <= r && r < 0xE000 { if r > unicode.MaxRune || 0xD800 <= r && r < 0xE000 {
return -1, newDecodeError(b, "escape sequence is invalid Unicode code point") return -1, NewParserError(b, "escape sequence is invalid Unicode code point")
} }
return rune(r), nil return rune(r), nil
} }
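The `hexToRune` helper above turns the hex digits of a `\uXXXX` or `\UXXXXXXXX` escape into a rune, rejecting UTF-16 surrogate halves and code points past `unicode.MaxRune`. A standalone sketch of the same logic (error texts simplified; not the library's exact implementation):

```go
package main

import (
	"fmt"
	"unicode"
)

// hexToRune mirrors the parser helper from the diff: it reads exactly
// `length` hex digits from b and validates the resulting code point.
func hexToRune(b []byte, length int) (rune, error) {
	if len(b) < length {
		return -1, fmt.Errorf("unicode point needs %d characters, not %d", length, len(b))
	}
	b = b[:length]

	var r uint32
	for i, c := range b {
		var d uint32
		switch {
		case '0' <= c && c <= '9':
			d = uint32(c - '0')
		case 'a' <= c && c <= 'f':
			d = uint32(c - 'a' + 10)
		case 'A' <= c && c <= 'F':
			d = uint32(c - 'A' + 10)
		default:
			return -1, fmt.Errorf("non-hex character at index %d", i)
		}
		r = r*16 + d
	}

	// Surrogate halves (U+D800..U+DFFF) and anything past MaxRune are
	// not valid Unicode scalar values.
	if r > unicode.MaxRune || 0xD800 <= r && r < 0xE000 {
		return -1, fmt.Errorf("escape sequence is invalid Unicode code point")
	}

	return rune(r), nil
}

func main() {
	r, _ := hexToRune([]byte("0041"), 4)
	fmt.Printf("%c\n", r) // prints "A"
}
```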
func (p *parser) parseWhitespace(b []byte) []byte { func (p *Parser) parseWhitespace(b []byte) []byte {
// ws = *wschar // ws = *wschar
// wschar = %x20 ; Space // wschar = %x20 ; Space
// wschar =/ %x09 ; Horizontal tab // wschar =/ %x09 ; Horizontal tab
@@ -850,24 +912,24 @@ func (p *parser) parseWhitespace(b []byte) []byte {
} }
//nolint:cyclop //nolint:cyclop
func (p *parser) parseIntOrFloatOrDateTime(b []byte) (ast.Reference, []byte, error) { func (p *Parser) parseIntOrFloatOrDateTime(b []byte) (reference, []byte, error) {
switch b[0] { switch b[0] {
case 'i': case 'i':
if !scanFollowsInf(b) { if !scanFollowsInf(b) {
return ast.InvalidReference, nil, newDecodeError(atmost(b, 3), "expected 'inf'") return invalidReference, nil, NewParserError(atmost(b, 3), "expected 'inf'")
} }
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: ast.Float, Kind: Float,
Data: b[:3], Data: b[:3],
}), b[3:], nil }), b[3:], nil
case 'n': case 'n':
if !scanFollowsNan(b) { if !scanFollowsNan(b) {
return ast.InvalidReference, nil, newDecodeError(atmost(b, 3), "expected 'nan'") return invalidReference, nil, NewParserError(atmost(b, 3), "expected 'nan'")
} }
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: ast.Float, Kind: Float,
Data: b[:3], Data: b[:3],
}), b[3:], nil }), b[3:], nil
case '+', '-': case '+', '-':
@@ -898,7 +960,7 @@ func (p *parser) parseIntOrFloatOrDateTime(b []byte) (ast.Reference, []byte, err
return p.scanIntOrFloat(b) return p.scanIntOrFloat(b)
} }
func (p *parser) scanDateTime(b []byte) (ast.Reference, []byte, error) { func (p *Parser) scanDateTime(b []byte) (reference, []byte, error) {
// scans for contiguous characters in [0-9T:Z.+-], and up to one space if // scans for contiguous characters in [0-9T:Z.+-], and up to one space if
// followed by a digit. // followed by a digit.
hasDate := false hasDate := false
@@ -941,30 +1003,30 @@ byteLoop:
} }
} }
var kind ast.Kind var kind Kind
if hasTime { if hasTime {
if hasDate { if hasDate {
if hasTz { if hasTz {
kind = ast.DateTime kind = DateTime
} else { } else {
kind = ast.LocalDateTime kind = LocalDateTime
} }
} else { } else {
kind = ast.LocalTime kind = LocalTime
} }
} else { } else {
kind = ast.LocalDate kind = LocalDate
} }
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: kind, Kind: kind,
Data: b[:i], Data: b[:i],
}), b[i:], nil }), b[i:], nil
} }
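The tail of `scanDateTime` picks one of four node kinds from the `hasDate`/`hasTime`/`hasTz` flags collected while scanning. The sketch below approximates those flags with substring heuristics instead of the parser's byte-by-byte scan, and assumes its input is already a well-formed TOML date/time token:

```go
package main

import (
	"fmt"
	"strings"
)

// classifyDateTime is a simplified sketch of the kind selection at the
// end of scanDateTime. The heuristics stand in for the real flags:
// a yyyy-mm-dd prefix sets hasDate, a ':' sets hasTime, and a trailing
// 'Z' or an offset ('+', or an extra '-' after the date) sets hasTz.
func classifyDateTime(s string) string {
	hasDate := strings.Count(s, "-") >= 2 && s[4] == '-'
	hasTime := strings.Contains(s, ":")
	hasTz := strings.HasSuffix(s, "Z") || strings.Contains(s[1:], "+") ||
		(hasDate && hasTime && strings.Count(s, "-") > 2)

	switch {
	case hasTime && hasDate && hasTz:
		return "DateTime"
	case hasTime && hasDate:
		return "LocalDateTime"
	case hasTime:
		return "LocalTime"
	default:
		return "LocalDate"
	}
}

func main() {
	for _, s := range []string{
		"2021-07-21T12:08:05Z",
		"2021-07-21 12:08:05+08:00",
		"2021-07-21T12:08:05",
		"2021-07-21",
		"12:08:05",
	} {
		fmt.Printf("%-26s -> %s\n", s, classifyDateTime(s))
	}
}
```

These inputs match the `TestParser_AST_DateTimes` cases later in the diff, which exercise the same four-way split.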
//nolint:funlen,gocognit,cyclop //nolint:funlen,gocognit,cyclop
func (p *parser) scanIntOrFloat(b []byte) (ast.Reference, []byte, error) { func (p *Parser) scanIntOrFloat(b []byte) (reference, []byte, error) {
i := 0 i := 0
if len(b) > 2 && b[0] == '0' && b[1] != '.' && b[1] != 'e' && b[1] != 'E' { if len(b) > 2 && b[0] == '0' && b[1] != '.' && b[1] != 'e' && b[1] != 'E' {
@@ -990,8 +1052,8 @@ func (p *parser) scanIntOrFloat(b []byte) (ast.Reference, []byte, error) {
} }
} }
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: ast.Integer, Kind: Integer,
Data: b[:i], Data: b[:i],
}), b[i:], nil }), b[i:], nil
} }
@@ -1013,40 +1075,40 @@ func (p *parser) scanIntOrFloat(b []byte) (ast.Reference, []byte, error) {
if c == 'i' { if c == 'i' {
if scanFollowsInf(b[i:]) { if scanFollowsInf(b[i:]) {
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: ast.Float, Kind: Float,
Data: b[:i+3], Data: b[:i+3],
}), b[i+3:], nil }), b[i+3:], nil
} }
return ast.InvalidReference, nil, newDecodeError(b[i:i+1], "unexpected character 'i' while scanning for a number") return invalidReference, nil, NewParserError(b[i:i+1], "unexpected character 'i' while scanning for a number")
} }
if c == 'n' { if c == 'n' {
if scanFollowsNan(b[i:]) { if scanFollowsNan(b[i:]) {
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: ast.Float, Kind: Float,
Data: b[:i+3], Data: b[:i+3],
}), b[i+3:], nil }), b[i+3:], nil
} }
return ast.InvalidReference, nil, newDecodeError(b[i:i+1], "unexpected character 'n' while scanning for a number") return invalidReference, nil, NewParserError(b[i:i+1], "unexpected character 'n' while scanning for a number")
} }
break break
} }
if i == 0 { if i == 0 {
return ast.InvalidReference, b, newDecodeError(b, "incomplete number") return invalidReference, b, NewParserError(b, "incomplete number")
} }
kind := ast.Integer kind := Integer
if isFloat { if isFloat {
kind = ast.Float kind = Float
} }
return p.builder.Push(ast.Node{ return p.builder.Push(Node{
Kind: kind, Kind: kind,
Data: b[:i], Data: b[:i],
}), b[i:], nil }), b[i:], nil
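`scanIntOrFloat` ends up tagging the token `Integer` unless it saw a fractional dot, an exponent, or an `inf`/`nan` literal. A hedged, stdlib-only sketch of that classification (it assumes the token was already validated as a TOML number; hex/octal/binary prefixes are excluded first because hex digits can contain `e`):

```go
package main

import (
	"fmt"
	"strings"
)

// intOrFloatKind sketches scanIntOrFloat's decision: Float for inf/nan,
// fractional, or exponent forms; Integer otherwise.
func intOrFloatKind(tok string) string {
	t := strings.TrimLeft(tok, "+-")
	if t == "inf" || t == "nan" {
		return "Float"
	}
	// Hex/octal/binary integers may contain 'e', so exclude them first.
	if strings.HasPrefix(t, "0x") || strings.HasPrefix(t, "0o") || strings.HasPrefix(t, "0b") {
		return "Integer"
	}
	if strings.ContainsAny(t, ".eE") {
		return "Float"
	}
	return "Integer"
}

func main() {
	for _, tok := range []string{"1234", "0xdead_beef", "224_617.445_991_228", "5e+22", "-inf"} {
		fmt.Printf("%-20s %s\n", tok, intOrFloatKind(tok))
	}
}
```

The inputs mirror the `TestParser_AST_Numbers` table later in the diff, which pins down the same Integer/Float split.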
@@ -1075,11 +1137,11 @@ func isValidBinaryRune(r byte) bool {
func expect(x byte, b []byte) ([]byte, error) { func expect(x byte, b []byte) ([]byte, error) {
if len(b) == 0 { if len(b) == 0 {
return nil, newDecodeError(b, "expected character %c but the document ended here", x) return nil, NewParserError(b, "expected character %c but the document ended here", x)
} }
if b[0] != x { if b[0] != x {
return nil, newDecodeError(b[0:1], "expected character %c", x) return nil, NewParserError(b[0:1], "expected character %c", x)
} }
return b[1:], nil return b[1:], nil
+94 -72
@@ -1,143 +1,142 @@
package toml package unstable
import ( import (
"fmt"
"strconv" "strconv"
"strings" "strings"
"testing" "testing"
"github.com/pelletier/go-toml/v2/internal/ast"
"github.com/stretchr/testify/require" "github.com/stretchr/testify/require"
) )
//nolint:funlen
func TestParser_AST_Numbers(t *testing.T) { func TestParser_AST_Numbers(t *testing.T) {
examples := []struct { examples := []struct {
desc string desc string
input string input string
kind ast.Kind kind Kind
err bool err bool
}{ }{
{ {
desc: "integer just digits", desc: "integer just digits",
input: `1234`, input: `1234`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer zero", desc: "integer zero",
input: `0`, input: `0`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer sign", desc: "integer sign",
input: `+99`, input: `+99`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer hex uppercase", desc: "integer hex uppercase",
input: `0xDEADBEEF`, input: `0xDEADBEEF`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer hex lowercase", desc: "integer hex lowercase",
input: `0xdead_beef`, input: `0xdead_beef`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer octal", desc: "integer octal",
input: `0o01234567`, input: `0o01234567`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "integer binary", desc: "integer binary",
input: `0b11010110`, input: `0b11010110`,
kind: ast.Integer, kind: Integer,
}, },
{ {
desc: "float zero", desc: "float zero",
input: `0.0`, input: `0.0`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float positive zero", desc: "float positive zero",
input: `+0.0`, input: `+0.0`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float negative zero", desc: "float negative zero",
input: `-0.0`, input: `-0.0`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float pi", desc: "float pi",
input: `3.1415`, input: `3.1415`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float negative", desc: "float negative",
input: `-0.01`, input: `-0.01`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float signed exponent", desc: "float signed exponent",
input: `5e+22`, input: `5e+22`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float exponent lowercase", desc: "float exponent lowercase",
input: `1e06`, input: `1e06`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float exponent uppercase", desc: "float exponent uppercase",
input: `-2E-2`, input: `-2E-2`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float fractional with exponent", desc: "float fractional with exponent",
input: `6.626e-34`, input: `6.626e-34`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "float underscores", desc: "float underscores",
input: `224_617.445_991_228`, input: `224_617.445_991_228`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "inf", desc: "inf",
input: `inf`, input: `inf`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "inf negative", desc: "inf negative",
input: `-inf`, input: `-inf`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "inf positive", desc: "inf positive",
input: `+inf`, input: `+inf`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "nan", desc: "nan",
input: `nan`, input: `nan`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "nan negative", desc: "nan negative",
input: `-nan`, input: `-nan`,
kind: ast.Float, kind: Float,
}, },
{ {
desc: "nan positive", desc: "nan positive",
input: `+nan`, input: `+nan`,
kind: ast.Float, kind: Float,
}, },
} }
for _, e := range examples { for _, e := range examples {
e := e e := e
t.Run(e.desc, func(t *testing.T) { t.Run(e.desc, func(t *testing.T) {
p := parser{} p := Parser{}
p.Reset([]byte(`A = ` + e.input)) p.Reset([]byte(`A = ` + e.input))
p.NextExpression() p.NextExpression()
err := p.Error() err := p.Error()
@@ -147,10 +146,10 @@ func TestParser_AST_Numbers(t *testing.T) {
require.NoError(t, err) require.NoError(t, err)
expected := astNode{ expected := astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{Kind: e.kind, Data: []byte(e.input)}, {Kind: e.kind, Data: []byte(e.input)},
{Kind: ast.Key, Data: []byte(`A`)}, {Kind: Key, Data: []byte(`A`)},
}, },
} }
compareNode(t, expected, p.Expression()) compareNode(t, expected, p.Expression())
@@ -161,13 +160,13 @@ func TestParser_AST_Numbers(t *testing.T) {
type ( type (
astNode struct { astNode struct {
Kind ast.Kind Kind Kind
Data []byte Data []byte
Children []astNode Children []astNode
} }
) )
func compareNode(t *testing.T, e astNode, n *ast.Node) { func compareNode(t *testing.T, e astNode, n *Node) {
t.Helper() t.Helper()
require.Equal(t, e.Kind, n.Kind) require.Equal(t, e.Kind, n.Kind)
require.Equal(t, e.Data, n.Data) require.Equal(t, e.Data, n.Data)
@@ -175,7 +174,7 @@ func compareNode(t *testing.T, e astNode, n *ast.Node) {
compareIterator(t, e.Children, n.Children()) compareIterator(t, e.Children, n.Children())
} }
func compareIterator(t *testing.T, expected []astNode, actual ast.Iterator) { func compareIterator(t *testing.T, expected []astNode, actual Iterator) {
t.Helper() t.Helper()
idx := 0 idx := 0
@@ -209,14 +208,14 @@ func TestParser_AST(t *testing.T) {
desc: "simple string assignment", desc: "simple string assignment",
input: `A = "hello"`, input: `A = "hello"`,
ast: astNode{ ast: astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.String, Kind: String,
Data: []byte(`hello`), Data: []byte(`hello`),
}, },
{ {
Kind: ast.Key, Kind: Key,
Data: []byte(`A`), Data: []byte(`A`),
}, },
}, },
@@ -226,14 +225,14 @@ func TestParser_AST(t *testing.T) {
desc: "simple bool assignment", desc: "simple bool assignment",
input: `A = true`, input: `A = true`,
ast: astNode{ ast: astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.Bool, Kind: Bool,
Data: []byte(`true`), Data: []byte(`true`),
}, },
{ {
Kind: ast.Key, Kind: Key,
Data: []byte(`A`), Data: []byte(`A`),
}, },
}, },
@@ -243,24 +242,24 @@ func TestParser_AST(t *testing.T) {
desc: "array with nested arrays of strings", desc: "array with nested arrays of strings",
input: `A = ["hello", ["world", "again"]]`, input: `A = ["hello", ["world", "again"]]`,
ast: astNode{ ast: astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.Array, Kind: Array,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.String, Kind: String,
Data: []byte(`hello`), Data: []byte(`hello`),
}, },
{ {
Kind: ast.Array, Kind: Array,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.String, Kind: String,
Data: []byte(`world`), Data: []byte(`world`),
}, },
{ {
Kind: ast.String, Kind: String,
Data: []byte(`again`), Data: []byte(`again`),
}, },
}, },
@@ -268,7 +267,7 @@ func TestParser_AST(t *testing.T) {
}, },
}, },
{ {
Kind: ast.Key, Kind: Key,
Data: []byte(`A`), Data: []byte(`A`),
}, },
}, },
@@ -278,23 +277,23 @@ func TestParser_AST(t *testing.T) {
desc: "array of strings", desc: "array of strings",
input: `A = ["hello", "world"]`, input: `A = ["hello", "world"]`,
ast: astNode{ ast: astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.Array, Kind: Array,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.String, Kind: String,
Data: []byte(`hello`), Data: []byte(`hello`),
}, },
{ {
Kind: ast.String, Kind: String,
Data: []byte(`world`), Data: []byte(`world`),
}, },
}, },
}, },
{ {
Kind: ast.Key, Kind: Key,
Data: []byte(`A`), Data: []byte(`A`),
}, },
}, },
@@ -304,29 +303,29 @@ func TestParser_AST(t *testing.T) {
desc: "inline table", desc: "inline table",
input: `name = { first = "Tom", last = "Preston-Werner" }`, input: `name = { first = "Tom", last = "Preston-Werner" }`,
ast: astNode{ ast: astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.InlineTable, Kind: InlineTable,
Children: []astNode{ Children: []astNode{
{ {
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{Kind: ast.String, Data: []byte(`Tom`)}, {Kind: String, Data: []byte(`Tom`)},
{Kind: ast.Key, Data: []byte(`first`)}, {Kind: Key, Data: []byte(`first`)},
}, },
}, },
{ {
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{Kind: ast.String, Data: []byte(`Preston-Werner`)}, {Kind: String, Data: []byte(`Preston-Werner`)},
{Kind: ast.Key, Data: []byte(`last`)}, {Kind: Key, Data: []byte(`last`)},
}, },
}, },
}, },
}, },
{ {
Kind: ast.Key, Kind: Key,
Data: []byte(`name`), Data: []byte(`name`),
}, },
}, },
@@ -337,7 +336,7 @@ func TestParser_AST(t *testing.T) {
for _, e := range examples { for _, e := range examples {
e := e e := e
t.Run(e.desc, func(t *testing.T) { t.Run(e.desc, func(t *testing.T) {
p := parser{} p := Parser{}
p.Reset([]byte(e.input)) p.Reset([]byte(e.input))
p.NextExpression() p.NextExpression()
err := p.Error() err := p.Error()
@@ -352,7 +351,7 @@ func TestParser_AST(t *testing.T) {
} }
func BenchmarkParseBasicStringWithUnicode(b *testing.B) { func BenchmarkParseBasicStringWithUnicode(b *testing.B) {
p := &parser{} p := &Parser{}
b.Run("4", func(b *testing.B) { b.Run("4", func(b *testing.B) {
input := []byte(`"\u1234\u5678\u9ABC\u1234\u5678\u9ABC"`) input := []byte(`"\u1234\u5678\u9ABC\u1234\u5678\u9ABC"`)
b.ReportAllocs() b.ReportAllocs()
@@ -374,7 +373,7 @@ func BenchmarkParseBasicStringWithUnicode(b *testing.B) {
} }
func BenchmarkParseBasicStringsEasy(b *testing.B) { func BenchmarkParseBasicStringsEasy(b *testing.B) {
p := &parser{} p := &Parser{}
for _, size := range []int{1, 4, 8, 16, 21} { for _, size := range []int{1, 4, 8, 16, 21} {
b.Run(strconv.Itoa(size), func(b *testing.B) { b.Run(strconv.Itoa(size), func(b *testing.B) {
@@ -394,40 +393,40 @@ func TestParser_AST_DateTimes(t *testing.T) {
examples := []struct { examples := []struct {
desc string desc string
input string input string
kind ast.Kind kind Kind
err bool err bool
}{ }{
{ {
desc: "offset-date-time with delim 'T' and UTC offset", desc: "offset-date-time with delim 'T' and UTC offset",
input: `2021-07-21T12:08:05Z`, input: `2021-07-21T12:08:05Z`,
kind: ast.DateTime, kind: DateTime,
}, },
{ {
desc: "offset-date-time with space delim and +8hours offset", desc: "offset-date-time with space delim and +8hours offset",
input: `2021-07-21 12:08:05+08:00`, input: `2021-07-21 12:08:05+08:00`,
kind: ast.DateTime, kind: DateTime,
}, },
{ {
desc: "local-date-time with nano second", desc: "local-date-time with nano second",
input: `2021-07-21T12:08:05.666666666`, input: `2021-07-21T12:08:05.666666666`,
kind: ast.LocalDateTime, kind: LocalDateTime,
}, },
{ {
desc: "local-date-time", desc: "local-date-time",
input: `2021-07-21T12:08:05`, input: `2021-07-21T12:08:05`,
kind: ast.LocalDateTime, kind: LocalDateTime,
}, },
{ {
desc: "local-date", desc: "local-date",
input: `2021-07-21`, input: `2021-07-21`,
kind: ast.LocalDate, kind: LocalDate,
}, },
} }
for _, e := range examples { for _, e := range examples {
e := e e := e
t.Run(e.desc, func(t *testing.T) { t.Run(e.desc, func(t *testing.T) {
p := parser{} p := Parser{}
p.Reset([]byte(`A = ` + e.input)) p.Reset([]byte(`A = ` + e.input))
p.NextExpression() p.NextExpression()
err := p.Error() err := p.Error()
@@ -437,10 +436,10 @@ func TestParser_AST_DateTimes(t *testing.T) {
require.NoError(t, err) require.NoError(t, err)
expected := astNode{ expected := astNode{
Kind: ast.KeyValue, Kind: KeyValue,
Children: []astNode{ Children: []astNode{
{Kind: e.kind, Data: []byte(e.input)}, {Kind: e.kind, Data: []byte(e.input)},
{Kind: ast.Key, Data: []byte(`A`)}, {Kind: Key, Data: []byte(`A`)},
}, },
} }
compareNode(t, expected, p.Expression()) compareNode(t, expected, p.Expression())
@@ -448,3 +447,26 @@ func TestParser_AST_DateTimes(t *testing.T) {
}) })
} }
} }
func ExampleParser() {
doc := `
hello = "world"
value = 42
`
p := Parser{}
p.Reset([]byte(doc))
for p.NextExpression() {
e := p.Expression()
fmt.Printf("Expression: %s\n", e.Kind)
value := e.Value()
it := e.Key()
k := it.Node() // shortcut: we know there is no dotted key in the example
fmt.Printf("%s -> (%s) %s\n", k.Data, value.Kind, value.Data)
}
// Output:
// Expression: KeyValue
// hello -> (String) world
// Expression: KeyValue
// value -> (Integer) 42
}
+26 -24
@@ -1,4 +1,6 @@
package toml package unstable
import "github.com/pelletier/go-toml/v2/internal/characters"
func scanFollows(b []byte, pattern string) bool { func scanFollows(b []byte, pattern string) bool {
n := len(pattern) n := len(pattern)
@@ -54,16 +56,16 @@ func scanLiteralString(b []byte) ([]byte, []byte, error) {
case '\'': case '\'':
return b[:i+1], b[i+1:], nil return b[:i+1], b[i+1:], nil
case '\n', '\r': case '\n', '\r':
return nil, nil, newDecodeError(b[i:i+1], "literal strings cannot have new lines") return nil, nil, NewParserError(b[i:i+1], "literal strings cannot have new lines")
} }
size := utf8ValidNext(b[i:]) size := characters.Utf8ValidNext(b[i:])
if size == 0 { if size == 0 {
return nil, nil, newDecodeError(b[i:i+1], "invalid character") return nil, nil, NewParserError(b[i:i+1], "invalid character")
} }
i += size i += size
} }
return nil, nil, newDecodeError(b[len(b):], "unterminated literal string") return nil, nil, NewParserError(b[len(b):], "unterminated literal string")
} }
func scanMultilineLiteralString(b []byte) ([]byte, []byte, error) { func scanMultilineLiteralString(b []byte) ([]byte, []byte, error) {
@@ -98,39 +100,39 @@ func scanMultilineLiteralString(b []byte) ([]byte, []byte, error) {
i++ i++
if i < len(b) && b[i] == '\'' { if i < len(b) && b[i] == '\'' {
return nil, nil, newDecodeError(b[i-3:i+1], "''' not allowed in multiline literal string") return nil, nil, NewParserError(b[i-3:i+1], "''' not allowed in multiline literal string")
} }
return b[:i], b[i:], nil return b[:i], b[i:], nil
} }
case '\r': case '\r':
if len(b) < i+2 { if len(b) < i+2 {
return nil, nil, newDecodeError(b[len(b):], `need a \n after \r`) return nil, nil, NewParserError(b[len(b):], `need a \n after \r`)
} }
if b[i+1] != '\n' { if b[i+1] != '\n' {
return nil, nil, newDecodeError(b[i:i+2], `need a \n after \r`) return nil, nil, NewParserError(b[i:i+2], `need a \n after \r`)
} }
i += 2 // skip the \n i += 2 // skip the \n
continue continue
} }
size := utf8ValidNext(b[i:]) size := characters.Utf8ValidNext(b[i:])
if size == 0 { if size == 0 {
return nil, nil, newDecodeError(b[i:i+1], "invalid character") return nil, nil, NewParserError(b[i:i+1], "invalid character")
} }
i += size i += size
} }
return nil, nil, newDecodeError(b[len(b):], `multiline literal string not terminated by '''`) return nil, nil, NewParserError(b[len(b):], `multiline literal string not terminated by '''`)
} }
func scanWindowsNewline(b []byte) ([]byte, []byte, error) { func scanWindowsNewline(b []byte) ([]byte, []byte, error) {
const lenCRLF = 2 const lenCRLF = 2
if len(b) < lenCRLF { if len(b) < lenCRLF {
return nil, nil, newDecodeError(b, "windows new line expected") return nil, nil, NewParserError(b, "windows new line expected")
} }
if b[1] != '\n' { if b[1] != '\n' {
return nil, nil, newDecodeError(b, `windows new line should be \r\n`) return nil, nil, NewParserError(b, `windows new line should be \r\n`)
} }
return b[:lenCRLF], b[lenCRLF:], nil return b[:lenCRLF], b[lenCRLF:], nil
@@ -165,11 +167,11 @@ func scanComment(b []byte) ([]byte, []byte, error) {
if i+1 < len(b) && b[i+1] == '\n' { if i+1 < len(b) && b[i+1] == '\n' {
return b[:i+1], b[i+1:], nil return b[:i+1], b[i+1:], nil
} }
return nil, nil, newDecodeError(b[i:i+1], "invalid character in comment") return nil, nil, NewParserError(b[i:i+1], "invalid character in comment")
} }
size := utf8ValidNext(b[i:]) size := characters.Utf8ValidNext(b[i:])
if size == 0 { if size == 0 {
return nil, nil, newDecodeError(b[i:i+1], "invalid character in comment") return nil, nil, NewParserError(b[i:i+1], "invalid character in comment")
} }
i += size i += size
@@ -192,17 +194,17 @@ func scanBasicString(b []byte) ([]byte, bool, []byte, error) {
case '"': case '"':
return b[:i+1], escaped, b[i+1:], nil return b[:i+1], escaped, b[i+1:], nil
case '\n', '\r': case '\n', '\r':
return nil, escaped, nil, newDecodeError(b[i:i+1], "basic strings cannot have new lines") return nil, escaped, nil, NewParserError(b[i:i+1], "basic strings cannot have new lines")
case '\\': case '\\':
if len(b) < i+2 { if len(b) < i+2 {
return nil, escaped, nil, newDecodeError(b[i:i+1], "need a character after \\") return nil, escaped, nil, NewParserError(b[i:i+1], "need a character after \\")
} }
escaped = true escaped = true
i++ // skip the next character i++ // skip the next character
} }
} }
return nil, escaped, nil, newDecodeError(b[len(b):], `basic string not terminated by "`) return nil, escaped, nil, NewParserError(b[len(b):], `basic string not terminated by "`)
} }
func scanMultilineBasicString(b []byte) ([]byte, bool, []byte, error) { func scanMultilineBasicString(b []byte) ([]byte, bool, []byte, error) {
@@ -243,27 +245,27 @@ func scanMultilineBasicString(b []byte) ([]byte, bool, []byte, error) {
i++ i++
if i < len(b) && b[i] == '"' { if i < len(b) && b[i] == '"' {
return nil, escaped, nil, newDecodeError(b[i-3:i+1], `""" not allowed in multiline basic string`) return nil, escaped, nil, NewParserError(b[i-3:i+1], `""" not allowed in multiline basic string`)
} }
return b[:i], escaped, b[i:], nil return b[:i], escaped, b[i:], nil
} }
case '\\': case '\\':
if len(b) < i+2 { if len(b) < i+2 {
return nil, escaped, nil, newDecodeError(b[len(b):], "need a character after \\") return nil, escaped, nil, NewParserError(b[len(b):], "need a character after \\")
} }
escaped = true escaped = true
i++ // skip the next character i++ // skip the next character
case '\r': case '\r':
if len(b) < i+2 { if len(b) < i+2 {
return nil, escaped, nil, newDecodeError(b[len(b):], `need a \n after \r`) return nil, escaped, nil, NewParserError(b[len(b):], `need a \n after \r`)
} }
if b[i+1] != '\n' { if b[i+1] != '\n' {
return nil, escaped, nil, newDecodeError(b[i:i+2], `need a \n after \r`) return nil, escaped, nil, NewParserError(b[i:i+2], `need a \n after \r`)
} }
i++ // skip the \n i++ // skip the \n
} }
} }
return nil, escaped, nil, newDecodeError(b[len(b):], `multiline basic string not terminated by """`) return nil, escaped, nil, NewParserError(b[len(b):], `multiline basic string not terminated by """`)
} }
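The scanner functions renamed in this file all follow the same shape: return the raw token, a flag, the unconsumed rest of the input, and an error built with the newly exported `NewParserError`. The `escaped` flag from `scanBasicString` is what lets `parseBasicString` above skip the unescaping pass entirely. A standalone sketch of that scanner (plain `errors.New` stands in for `NewParserError`, which also carries the highlighted byte range):

```go
package main

import (
	"errors"
	"fmt"
)

// scanBasicString finds the closing quote of a TOML basic string, flags
// whether any escape sequence was seen (so the caller can skip the
// unescaping pass), and returns the token including both quotes plus
// the unconsumed rest of the input.
func scanBasicString(b []byte) (token []byte, escaped bool, rest []byte, err error) {
	// The caller guarantees b[0] == '"'.
	for i := 1; i < len(b); i++ {
		switch b[i] {
		case '"':
			return b[:i+1], escaped, b[i+1:], nil
		case '\n', '\r':
			return nil, escaped, nil, errors.New("basic strings cannot have new lines")
		case '\\':
			if len(b) < i+2 {
				return nil, escaped, nil, errors.New(`need a character after \`)
			}
			escaped = true
			i++ // skip the escaped character
		}
	}
	return nil, escaped, nil, errors.New(`basic string not terminated by "`)
}

func main() {
	tok, esc, rest, _ := scanBasicString([]byte(`"hello" world`))
	fmt.Printf("token=%s escaped=%v rest=%q\n", tok, esc, rest)
	// prints: token="hello" escaped=false rest=" world"
}
```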