➕ add TailwindCSS

+ a lot of node_modules?? unsure what happened

This commit is contained in:
parent 2ba37bfbe3
commit bb41712ce4
1088 changed files with 224305 additions and 175 deletions

node_modules/sucrase/LICENSE (+21; generated, vendored, Normal file)
@@ -0,0 +1,21 @@
The MIT License (MIT)

Copyright (c) 2012-2018 various contributors (see AUTHORS)

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

node_modules/sucrase/README.md (+295; generated, vendored, Normal file)
@@ -0,0 +1,295 @@
# Sucrase

[Build status](https://github.com/alangpierce/sucrase/actions)
[npm](https://www.npmjs.com/package/sucrase)
[Install size](https://packagephobia.now.sh/result?p=sucrase)
[License](LICENSE)
[Chat](https://gitter.im/sucrasejs/Lobby)

## [Try it out](https://sucrase.io)

## Quick usage

```bash
yarn add --dev sucrase  # Or npm install --save-dev sucrase
node -r sucrase/register main.ts
```

Using the [ts-node](https://github.com/TypeStrong/ts-node) integration:

```bash
yarn add --dev sucrase ts-node typescript
./node_modules/.bin/ts-node --transpiler sucrase/ts-node-plugin main.ts
```

## Project overview

Sucrase is an alternative to Babel that allows super-fast development builds.
Instead of compiling a large range of JS features to be able to work in Internet
Explorer, Sucrase assumes that you're developing with a recent browser or recent
Node.js version, so it focuses on compiling non-standard language extensions:
JSX, TypeScript, and Flow. Because of this smaller scope, Sucrase can get away
with an architecture that is much more performant but less extensible and
maintainable. Sucrase's parser is forked from Babel's parser (so Sucrase is
indebted to Babel and wouldn't be possible without it) and trims it down to a
focused subset of what Babel solves. If it fits your use case, hopefully Sucrase
can speed up your development experience!

**Sucrase has been extensively tested.** It can successfully build
the [Benchling](https://benchling.com/) frontend code,
[Babel](https://github.com/babel/babel),
[React](https://github.com/facebook/react),
[TSLint](https://github.com/palantir/tslint),
[Apollo client](https://github.com/apollographql/apollo-client), and
[decaffeinate](https://github.com/decaffeinate/decaffeinate)
with all tests passing, about 1 million lines of code total.

**Sucrase is about 20x faster than Babel.** Here's one measurement of how
Sucrase compares with other tools when compiling the Jest codebase 3 times,
about 360k lines of code total:

```text
            Time            Speed
Sucrase     0.57 seconds    636975 lines per second
swc         1.19 seconds    304526 lines per second
esbuild     1.45 seconds    248692 lines per second
TypeScript  8.98 seconds    40240 lines per second
Babel       9.18 seconds    39366 lines per second
```

Details: Measured in July 2022. Tools run in single-threaded mode without warm-up. See the
[benchmark code](https://github.com/alangpierce/sucrase/blob/main/benchmark/benchmark.ts)
for methodology and caveats.

## Transforms

The main configuration option in Sucrase is an array of transform names. These
transforms are available:

* **jsx**: Enables JSX syntax. By default, JSX is transformed to `React.createElement`,
  but may be preserved or transformed to `_jsx()` by setting the `jsxRuntime` option.
  Also adds `createReactClass` display names and JSX context information.
* **typescript**: Compiles TypeScript code to JavaScript, removing type
  annotations and handling features like enums. Does not check types. Sucrase
  transforms each file independently, so you should enable the `isolatedModules`
  TypeScript flag so that the typechecker will disallow the few features like
  `const enum`s that need cross-file compilation. The Sucrase option `keepUnusedImports`
  can be used to disable all automatic removal of imports and exports, analogous to TS
  `verbatimModuleSyntax`.
* **flow**: Removes Flow type annotations. Does not check types.
* **imports**: Transforms ES Modules (`import`/`export`) to CommonJS
  (`require`/`module.exports`) using the same approach as Babel and TypeScript
  with `--esModuleInterop`. If `preserveDynamicImport` is specified in the Sucrase
  options, then dynamic `import` expressions are left alone, which is particularly
  useful in Node to load ESM-only libraries. If `preserveDynamicImport` is not
  specified, `import` expressions are transformed into a promise-wrapped call to
  `require`.
* **react-hot-loader**: Performs the equivalent of the `react-hot-loader/babel`
  transform in the [react-hot-loader](https://github.com/gaearon/react-hot-loader)
  project. This enables advanced hot reloading use cases such as editing of
  bound methods.
* **jest**: Hoist desired [jest](https://jestjs.io/) method calls above imports in
  the same way as [babel-plugin-jest-hoist](https://github.com/facebook/jest/tree/master/packages/babel-plugin-jest-hoist).
  Does not validate the arguments passed to `jest.mock`, but the same rules still apply.

When the `imports` transform is *not* specified (i.e. when targeting ESM), the
`injectCreateRequireForImportRequire` option can be specified to transform TS
`import foo = require("foo");` in a way that matches the
[TypeScript 4.7 behavior](https://devblogs.microsoft.com/typescript/announcing-typescript-4-7/#commonjs-interoperability)
with `module: nodenext`.

These newer JS features are transformed by default:

* [Optional chaining](https://github.com/tc39/proposal-optional-chaining): `a?.b`
* [Nullish coalescing](https://github.com/tc39/proposal-nullish-coalescing): `a ?? b`
* [Class fields](https://github.com/tc39/proposal-class-fields): `class C { x = 1; }`.
  This includes static fields but not the `#x` private field syntax.
* [Numeric separators](https://github.com/tc39/proposal-numeric-separator):
  `const n = 1_234;`
* [Optional catch binding](https://github.com/tc39/proposal-optional-catch-binding):
  `try { doThing(); } catch { }`.

If your target runtime supports these features, you can specify
`disableESTransforms: true` so that Sucrase preserves the syntax rather than
trying to transform it. Note that transpiled and standard class fields behave
slightly differently; see the
[TypeScript 3.7 release notes](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-7.html#the-usedefineforclassfields-flag-and-the-declare-property-modifier)
for details. If you use TypeScript, you can enable the TypeScript option
`useDefineForClassFields` to enable error checking related to these differences.
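For reference, the runtime behavior that these default ES transforms emulate can be checked directly on any modern runtime (an illustrative snippet, not Sucrase output):

```javascript
// Behavior of the syntax that the default ES transforms target,
// runnable as-is on a recent Node.js.
const obj = { a: { b: 1 } };

// Optional chaining: short-circuits to undefined instead of throwing.
const present = obj.a?.b;        // 1
const missing = obj.missing?.b;  // undefined, no TypeError

// Nullish coalescing: falls back only on null/undefined, not on 0 or "".
const zero = 0 ?? 42;            // 0 (|| would give 42)
const fallback = null ?? 42;     // 42

// Numeric separators and optional catch binding.
const big = 1_000_000;
let caught = false;
try { throw new Error("boom"); } catch { caught = true; }

console.log(present, missing, zero, fallback, big, caught);
```

With `disableESTransforms: true`, this syntax passes through untouched; otherwise Sucrase rewrites it to equivalent older-syntax code with the same semantics.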

### Unsupported syntax

All JS syntax not mentioned above will "pass through" and needs to be supported
by your JS runtime. For example:

* Decorators, private fields, `throw` expressions, generator arrow functions,
  and `do` expressions are all unsupported in browsers and Node (as of this
  writing), and Sucrase doesn't make an attempt to transpile them.
* Object rest/spread, async functions, and async iterators are all recent
  features that should work fine, but might cause issues if you use older
  versions of tools like webpack. BigInt and newer regex features may or may not
  work, based on your tooling.

### JSX Options

By default, JSX is compiled to React functions in development mode. This can be
configured with a few options:

* **jsxRuntime**: A string specifying the transform mode, which can be one of three values:
  * `"classic"` (default): The original JSX transform that calls `React.createElement` by default.
    To configure for non-React use cases, specify:
    * **jsxPragma**: Element creation function, defaults to `React.createElement`.
    * **jsxFragmentPragma**: Fragment component, defaults to `React.Fragment`.
  * `"automatic"`: The [new JSX transform](https://reactjs.org/blog/2020/09/22/introducing-the-new-jsx-transform.html)
    introduced with React 17, which calls `jsx` functions and auto-adds import statements.
    To configure for non-React use cases, specify:
    * **jsxImportSource**: Package name for auto-generated import statements, defaults to `react`.
  * `"preserve"`: Don't transform JSX, and instead emit it as-is in the output code.
* **production**: If `true`, use production version of functions and don't include debugging
  information. When using React in production mode with the automatic transform, this *must* be
  set to true to avoid an error about `jsxDEV` being missing.

### Legacy CommonJS interop

Two legacy modes can be used with the `imports` transform:

* **enableLegacyTypeScriptModuleInterop**: Use the default TypeScript approach
  to CommonJS interop instead of assuming that TypeScript's `--esModuleInterop`
  flag is enabled. For example, if a CJS module exports a function, legacy
  TypeScript interop requires you to write `import * as add from './add';`,
  while Babel, Webpack, Node.js, and TypeScript with `--esModuleInterop` require
  you to write `import add from './add';`. As mentioned in the
  [docs](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-7.html#support-for-import-d-from-cjs-form-commonjs-modules-with---esmoduleinterop),
  the TypeScript team recommends you always use `--esModuleInterop`.
* **enableLegacyBabel5ModuleInterop**: Use the Babel 5 approach to CommonJS
  interop, so that you can run `require('./MyModule')` instead of
  `require('./MyModule').default`. Analogous to
  [babel-plugin-add-module-exports](https://github.com/59naga/babel-plugin-add-module-exports).
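The difference between the interop styles comes down to how a default import is resolved at runtime. As a minimal sketch, the `--esModuleInterop`-style helper has the same shape as the `interopRequireDefault` helper that appears in Sucrase's compiled output (the variable names here are illustrative):

```javascript
// Sketch of the esModuleInterop-style default-import helper.
// A transpiled ES module carries an __esModule marker; a plain CJS
// module does not, so its whole exports object becomes the default.
function interopRequireDefault(obj) {
  return obj && obj.__esModule ? obj : { default: obj };
}

// A CJS module that did `module.exports = function add(...) {...}`:
const cjsModule = function add(a, b) { return a + b; };
// A transpiled ES module with a real default export:
const esModule = { __esModule: true, default: (a, b) => a + b };

// `import add from './add'` compiles to roughly `interopRequireDefault(require('./add')).default`:
const add1 = interopRequireDefault(cjsModule).default; // the function itself
const add2 = interopRequireDefault(esModule).default;  // the real default export

console.log(add1(1, 2), add2(2, 3)); // 3 5
```

Under legacy TypeScript interop, no such wrapping happens, which is why the default-import form only works there for modules that were themselves transpiled from ES module syntax.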

## Usage

### Tool integrations

* [Webpack](https://github.com/alangpierce/sucrase/tree/main/integrations/webpack-loader)
* [Gulp](https://github.com/alangpierce/sucrase/tree/main/integrations/gulp-plugin)
* [Jest](https://github.com/alangpierce/sucrase/tree/main/integrations/jest-plugin)
* [Rollup](https://github.com/rollup/plugins/tree/master/packages/sucrase)
* [Broccoli](https://github.com/stefanpenner/broccoli-sucrase)

### Usage in Node

The most robust way is to use the Sucrase plugin for [ts-node](https://github.com/TypeStrong/ts-node),
which has various Node integrations and configures Sucrase via `tsconfig.json`:

```bash
ts-node --transpiler sucrase/ts-node-plugin
```

For projects that don't target ESM, Sucrase also has a require hook with some
reasonable defaults that can be accessed in a few ways:

* From code: `require("sucrase/register");`
* When invoking Node: `node -r sucrase/register main.ts`
* As a separate binary: `sucrase-node main.ts`

Options can be passed to the require hook via a `SUCRASE_OPTIONS` environment
variable holding a JSON string of options.
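For example, an invocation might look like this (a sketch assuming sucrase is installed in the project; the JSON keys mirror the options documented above):

```shell
# Pass options to the require hook as a JSON string.
SUCRASE_OPTIONS='{"transforms": ["typescript", "imports"]}' \
  node -r sucrase/register main.ts
```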

### Compiling a project to JS

For simple use cases, Sucrase comes with a `sucrase` CLI that mirrors your
directory structure to an output directory:

```bash
sucrase ./srcDir -d ./outDir --transforms typescript,imports
```

### Usage from code

For any advanced use cases, Sucrase can be called from JS directly:

```js
import {transform} from "sucrase";
const compiledCode = transform(code, {transforms: ["typescript", "imports"]}).code;
```

## What Sucrase is not

Sucrase is intended to be useful for the most common cases, but it does not aim
to have nearly the scope and versatility of Babel. Some specific examples:

* Sucrase does not check your code for errors. Sucrase's contract is that if you
  give it valid code, it will produce valid JS code. If you give it invalid
  code, it might produce invalid code, it might produce valid code, or it might
  give an error. Always use Sucrase with a linter or typechecker, which is more
  suited for error-checking.
* Sucrase is not pluginizable. With the current architecture, transforms need to
  be explicitly written to cooperate with each other, so each additional
  transform takes significant extra work.
* Sucrase is not good for prototyping language extensions and upcoming language
  features. Its faster architecture makes new transforms more difficult to write
  and more fragile.
* Sucrase will never produce code for old browsers like IE. Compiling code down
  to ES5 is much more complicated than any transformation that Sucrase needs to
  do.
* Sucrase is hesitant to implement upcoming JS features, although some of them
  make sense to implement for pragmatic reasons. Its main focus is on language
  extensions (JSX, TypeScript, Flow) that will never be supported by JS
  runtimes.
* Like Babel, Sucrase is not a typechecker, and must process each file in
  isolation. For example, TypeScript `const enum`s are treated as regular
  `enum`s rather than inlining across files.
* You should think carefully before using Sucrase in production. Sucrase is
  mostly beneficial in development, and in many cases, Babel or tsc will be more
  suitable for production builds.

See the [Project Vision](./docs/PROJECT_VISION.md) document for more details on
the philosophy behind Sucrase.

## Motivation

As JavaScript implementations mature, it becomes more and more reasonable to
disable Babel transforms, especially in development when you know that you're
targeting a modern runtime. You might hope that you could simplify and speed up
the build step by eventually disabling Babel entirely, but this isn't possible
if you're using a non-standard language extension like JSX, TypeScript, or Flow.
Unfortunately, disabling most transforms in Babel doesn't speed it up as much as
you might expect. To understand, let's take a look at how Babel works:

1. Tokenize the input source code into a token stream.
2. Parse the token stream into an AST.
3. Walk the AST to compute the scope information for each variable.
4. Apply all transform plugins in a single traversal, resulting in a new AST.
5. Print the resulting AST.

Only step 4 gets faster when disabling plugins, so there's always a fixed cost
to running Babel regardless of how many transforms are enabled.

Sucrase bypasses most of these steps, and works like this:

1. Tokenize the input source code into a token stream using a trimmed-down fork
   of the Babel parser. This fork does not produce a full AST, but still
   produces meaningful token metadata specifically designed for the later
   transforms.
2. Scan through the tokens, computing preliminary information like all
   imported/exported names.
3. Run the transform by doing a pass through the tokens and performing a number
   of careful find-and-replace operations, like replacing `<Foo` with
   `React.createElement(Foo`.

Because Sucrase works on a lower level and uses a custom parser for its use
case, it is much faster than Babel.
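The find-and-replace idea in step 3 can be illustrated with a deliberately naive sketch. This is a regex toy, nothing like Sucrase's real token-driven pass (which handles nesting, strings, attributes, and so on), but it shows the flavor of the rewrite:

```javascript
// Grossly simplified illustration of the classic JSX rewrite idea:
// replace an opening `<Component` with `React.createElement(Component`.
// Sucrase performs this over a real token stream, not with regexes.
function toyJsxRewrite(source) {
  return source
    // Self-closing element with no attributes: <Foo /> -> a complete call.
    .replace(/<([A-Z]\w*)\s*\/>/g, 'React.createElement($1, null)')
    // Any other opening tag: rewrite just the `<Name` prefix.
    .replace(/<([A-Z]\w*)/g, 'React.createElement($1');
}

console.log(toyJsxRewrite('<Foo />')); // React.createElement(Foo, null)
console.log(toyJsxRewrite('<div>'));   // <div> (lowercase tags untouched here)
```

The real transform's per-token replacements are what let Sucrase skip building and printing an AST entirely.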

## Contributing

Contributions are welcome, whether they be bug reports, PRs, docs, tests, or
anything else! Please take a look through the [Contributing Guide](./CONTRIBUTING.md)
to learn how to get started.

## License and attribution

Sucrase is MIT-licensed. A large part of Sucrase is based on a fork of the
[Babel parser](https://github.com/babel/babel/tree/main/packages/babel-parser),
which is also MIT-licensed.

## Why the name?

Sucrase is an enzyme that processes sugar. Get it?

node_modules/sucrase/bin/sucrase (+3; generated, vendored, Executable file)
@@ -0,0 +1,3 @@
#!/usr/bin/env node

require("../dist/cli").default();

node_modules/sucrase/bin/sucrase-node (+18; generated, vendored, Executable file)
@@ -0,0 +1,18 @@
#!/usr/bin/env node
const Module = require("module");
const {resolve} = require("path");

/*
 * Simple wrapper around node that first registers Sucrase with default settings.
 *
 * This is meant for simple use cases, and doesn't support custom Node/V8 args,
 * executing a code snippet, a REPL, or other things that you might find in
 * node, babel-node, or ts-node. For more advanced use cases, you can use
 * `node -r sucrase/register` or register a require hook programmatically from
 * your own code.
 */
require("../register");

process.argv.splice(1, 1);
process.argv[1] = resolve(process.argv[1]);
Module.runMain();

node_modules/sucrase/dist/CJSImportProcessor.js (+456; generated, vendored, Normal file)
@@ -0,0 +1,456 @@
"use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }

var _tokenizer = require('./parser/tokenizer');
var _keywords = require('./parser/tokenizer/keywords');
var _types = require('./parser/tokenizer/types');

var _getImportExportSpecifierInfo = require('./util/getImportExportSpecifierInfo'); var _getImportExportSpecifierInfo2 = _interopRequireDefault(_getImportExportSpecifierInfo);
var _getNonTypeIdentifiers = require('./util/getNonTypeIdentifiers');

/**
 * Class responsible for preprocessing and bookkeeping import and export declarations within the
 * file.
 *
 * TypeScript uses a simpler mechanism that does not use functions like interopRequireDefault and
 * interopRequireWildcard, so we also allow that mode for compatibility.
 */
 class CJSImportProcessor {
   __init() {this.nonTypeIdentifiers = new Set()}
   __init2() {this.importInfoByPath = new Map()}
   __init3() {this.importsToReplace = new Map()}
   __init4() {this.identifierReplacements = new Map()}
   __init5() {this.exportBindingsByLocalName = new Map()}

  constructor(
     nameManager,
     tokens,
     enableLegacyTypeScriptModuleInterop,
     options,
     isTypeScriptTransformEnabled,
     keepUnusedImports,
     helperManager,
  ) {;this.nameManager = nameManager;this.tokens = tokens;this.enableLegacyTypeScriptModuleInterop = enableLegacyTypeScriptModuleInterop;this.options = options;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.keepUnusedImports = keepUnusedImports;this.helperManager = helperManager;CJSImportProcessor.prototype.__init.call(this);CJSImportProcessor.prototype.__init2.call(this);CJSImportProcessor.prototype.__init3.call(this);CJSImportProcessor.prototype.__init4.call(this);CJSImportProcessor.prototype.__init5.call(this);}

  preprocessTokens() {
    for (let i = 0; i < this.tokens.tokens.length; i++) {
      if (
        this.tokens.matches1AtIndex(i, _types.TokenType._import) &&
        !this.tokens.matches3AtIndex(i, _types.TokenType._import, _types.TokenType.name, _types.TokenType.eq)
      ) {
        this.preprocessImportAtIndex(i);
      }
      if (
        this.tokens.matches1AtIndex(i, _types.TokenType._export) &&
        !this.tokens.matches2AtIndex(i, _types.TokenType._export, _types.TokenType.eq)
      ) {
        this.preprocessExportAtIndex(i);
      }
    }
    this.generateImportReplacements();
  }

  /**
   * In TypeScript, import statements that only import types should be removed.
   * This includes `import {} from 'foo';`, but not `import 'foo';`.
   */
  pruneTypeOnlyImports() {
    this.nonTypeIdentifiers = _getNonTypeIdentifiers.getNonTypeIdentifiers.call(void 0, this.tokens, this.options);
    for (const [path, importInfo] of this.importInfoByPath.entries()) {
      if (
        importInfo.hasBareImport ||
        importInfo.hasStarExport ||
        importInfo.exportStarNames.length > 0 ||
        importInfo.namedExports.length > 0
      ) {
        continue;
      }
      const names = [
        ...importInfo.defaultNames,
        ...importInfo.wildcardNames,
        ...importInfo.namedImports.map(({localName}) => localName),
      ];
      if (names.every((name) => this.shouldAutomaticallyElideImportedName(name))) {
        this.importsToReplace.set(path, "");
      }
    }
  }

  shouldAutomaticallyElideImportedName(name) {
    return (
      this.isTypeScriptTransformEnabled &&
      !this.keepUnusedImports &&
      !this.nonTypeIdentifiers.has(name)
    );
  }

   generateImportReplacements() {
    for (const [path, importInfo] of this.importInfoByPath.entries()) {
      const {
        defaultNames,
        wildcardNames,
        namedImports,
        namedExports,
        exportStarNames,
        hasStarExport,
      } = importInfo;

      if (
        defaultNames.length === 0 &&
        wildcardNames.length === 0 &&
        namedImports.length === 0 &&
        namedExports.length === 0 &&
        exportStarNames.length === 0 &&
        !hasStarExport
      ) {
        // Import is never used, so don't even assign a name.
        this.importsToReplace.set(path, `require('${path}');`);
        continue;
      }

      const primaryImportName = this.getFreeIdentifierForPath(path);
      let secondaryImportName;
      if (this.enableLegacyTypeScriptModuleInterop) {
        secondaryImportName = primaryImportName;
      } else {
        secondaryImportName =
          wildcardNames.length > 0 ? wildcardNames[0] : this.getFreeIdentifierForPath(path);
      }
      let requireCode = `var ${primaryImportName} = require('${path}');`;
      if (wildcardNames.length > 0) {
        for (const wildcardName of wildcardNames) {
          const moduleExpr = this.enableLegacyTypeScriptModuleInterop
            ? primaryImportName
            : `${this.helperManager.getHelperName("interopRequireWildcard")}(${primaryImportName})`;
          requireCode += ` var ${wildcardName} = ${moduleExpr};`;
        }
      } else if (exportStarNames.length > 0 && secondaryImportName !== primaryImportName) {
        requireCode += ` var ${secondaryImportName} = ${this.helperManager.getHelperName(
          "interopRequireWildcard",
        )}(${primaryImportName});`;
      } else if (defaultNames.length > 0 && secondaryImportName !== primaryImportName) {
        requireCode += ` var ${secondaryImportName} = ${this.helperManager.getHelperName(
          "interopRequireDefault",
        )}(${primaryImportName});`;
      }

      for (const {importedName, localName} of namedExports) {
        requireCode += ` ${this.helperManager.getHelperName(
          "createNamedExportFrom",
        )}(${primaryImportName}, '${localName}', '${importedName}');`;
      }
      for (const exportStarName of exportStarNames) {
        requireCode += ` exports.${exportStarName} = ${secondaryImportName};`;
      }
      if (hasStarExport) {
        requireCode += ` ${this.helperManager.getHelperName(
          "createStarExport",
        )}(${primaryImportName});`;
      }

      this.importsToReplace.set(path, requireCode);

      for (const defaultName of defaultNames) {
        this.identifierReplacements.set(defaultName, `${secondaryImportName}.default`);
      }
      for (const {importedName, localName} of namedImports) {
        this.identifierReplacements.set(localName, `${primaryImportName}.${importedName}`);
      }
    }
  }

  getFreeIdentifierForPath(path) {
    const components = path.split("/");
    const lastComponent = components[components.length - 1];
    const baseName = lastComponent.replace(/\W/g, "");
    return this.nameManager.claimFreeName(`_${baseName}`);
  }
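In isolation, the base-name mangling in `getFreeIdentifierForPath` behaves as follows (a standalone sketch of just that step, with the `NameManager` uniqueness bookkeeping left out):

```javascript
// Standalone sketch of the path-to-identifier step used above:
// take the last path component and strip every non-word character.
function baseNameForPath(path) {
  const components = path.split("/");
  const lastComponent = components[components.length - 1];
  return `_${lastComponent.replace(/\W/g, "")}`;
}

console.log(baseNameForPath("./util/my-helper.js")); // _myhelperjs
console.log(baseNameForPath("lodash"));              // _lodash
```

The real method then passes this candidate to `nameManager.claimFreeName`, which appends a counter if the name is already taken in the file.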

   preprocessImportAtIndex(index) {
    const defaultNames = [];
    const wildcardNames = [];
    const namedImports = [];

    index++;
    if (
      (this.tokens.matchesContextualAtIndex(index, _keywords.ContextualKeyword._type) ||
        this.tokens.matches1AtIndex(index, _types.TokenType._typeof)) &&
      !this.tokens.matches1AtIndex(index + 1, _types.TokenType.comma) &&
      !this.tokens.matchesContextualAtIndex(index + 1, _keywords.ContextualKeyword._from)
    ) {
      // import type declaration, so no need to process anything.
      return;
    }

    if (this.tokens.matches1AtIndex(index, _types.TokenType.parenL)) {
      // Dynamic import, so nothing to do
      return;
    }

    if (this.tokens.matches1AtIndex(index, _types.TokenType.name)) {
      defaultNames.push(this.tokens.identifierNameAtIndex(index));
      index++;
      if (this.tokens.matches1AtIndex(index, _types.TokenType.comma)) {
        index++;
      }
    }

    if (this.tokens.matches1AtIndex(index, _types.TokenType.star)) {
      // * as
      index += 2;
      wildcardNames.push(this.tokens.identifierNameAtIndex(index));
      index++;
    }

    if (this.tokens.matches1AtIndex(index, _types.TokenType.braceL)) {
      const result = this.getNamedImports(index + 1);
      index = result.newIndex;

      for (const namedImport of result.namedImports) {
        // Treat {default as X} as a default import to ensure usage of require interop helper
        if (namedImport.importedName === "default") {
          defaultNames.push(namedImport.localName);
        } else {
          namedImports.push(namedImport);
        }
      }
    }

    if (this.tokens.matchesContextualAtIndex(index, _keywords.ContextualKeyword._from)) {
      index++;
    }

    if (!this.tokens.matches1AtIndex(index, _types.TokenType.string)) {
      throw new Error("Expected string token at the end of import statement.");
    }
    const path = this.tokens.stringValueAtIndex(index);
    const importInfo = this.getImportInfo(path);
    importInfo.defaultNames.push(...defaultNames);
    importInfo.wildcardNames.push(...wildcardNames);
    importInfo.namedImports.push(...namedImports);
    if (defaultNames.length === 0 && wildcardNames.length === 0 && namedImports.length === 0) {
      importInfo.hasBareImport = true;
    }
  }

   preprocessExportAtIndex(index) {
    if (
      this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType._var) ||
      this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType._let) ||
|       this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType._const) | ||||
|     ) { | ||||
|       this.preprocessVarExportAtIndex(index); | ||||
|     } else if ( | ||||
|       this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType._function) || | ||||
|       this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType._class) | ||||
|     ) { | ||||
|       const exportName = this.tokens.identifierNameAtIndex(index + 2); | ||||
|       this.addExportBinding(exportName, exportName); | ||||
|     } else if (this.tokens.matches3AtIndex(index, _types.TokenType._export, _types.TokenType.name, _types.TokenType._function)) { | ||||
|       const exportName = this.tokens.identifierNameAtIndex(index + 3); | ||||
|       this.addExportBinding(exportName, exportName); | ||||
|     } else if (this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType.braceL)) { | ||||
|       this.preprocessNamedExportAtIndex(index); | ||||
|     } else if (this.tokens.matches2AtIndex(index, _types.TokenType._export, _types.TokenType.star)) { | ||||
|       this.preprocessExportStarAtIndex(index); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
  preprocessVarExportAtIndex(index) {
    let depth = 0;
    // Handle cases like `export let {x} = y;`, starting at the open-brace in that case.
    for (let i = index + 2; ; i++) {
      if (
        this.tokens.matches1AtIndex(i, _types.TokenType.braceL) ||
        this.tokens.matches1AtIndex(i, _types.TokenType.dollarBraceL) ||
        this.tokens.matches1AtIndex(i, _types.TokenType.bracketL)
      ) {
        depth++;
      } else if (
        this.tokens.matches1AtIndex(i, _types.TokenType.braceR) ||
        this.tokens.matches1AtIndex(i, _types.TokenType.bracketR)
      ) {
        depth--;
      } else if (depth === 0 && !this.tokens.matches1AtIndex(i, _types.TokenType.name)) {
        break;
      } else if (this.tokens.matches1AtIndex(i, _types.TokenType.eq)) {
        // Skip the initializer: names on the right-hand side of = are not declarations.
        const endIndex = this.tokens.tokens[i].rhsEndIndex;
        if (endIndex == null) {
          throw new Error("Expected = token with an end index.");
        }
        i = endIndex - 1;
      } else {
        const token = this.tokens.tokens[i];
        if (_tokenizer.isDeclaration.call(void 0, token)) {
          const exportName = this.tokens.identifierNameAtIndex(i);
          this.identifierReplacements.set(exportName, `exports.${exportName}`);
        }
      }
    }
  }

|   /** | ||||
|    * Walk this export statement just in case it's an export...from statement. | ||||
|    * If it is, combine it into the import info for that path. Otherwise, just | ||||
|    * bail out; it'll be handled later. | ||||
|    */ | ||||
|    preprocessNamedExportAtIndex(index) { | ||||
|     // export {
 | ||||
|     index += 2; | ||||
|     const {newIndex, namedImports} = this.getNamedImports(index); | ||||
|     index = newIndex; | ||||
| 
 | ||||
|     if (this.tokens.matchesContextualAtIndex(index, _keywords.ContextualKeyword._from)) { | ||||
|       index++; | ||||
|     } else { | ||||
|       // Reinterpret "a as b" to be local/exported rather than imported/local.
 | ||||
|       for (const {importedName: localName, localName: exportedName} of namedImports) { | ||||
|         this.addExportBinding(localName, exportedName); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     if (!this.tokens.matches1AtIndex(index, _types.TokenType.string)) { | ||||
|       throw new Error("Expected string token at the end of import statement."); | ||||
|     } | ||||
|     const path = this.tokens.stringValueAtIndex(index); | ||||
|     const importInfo = this.getImportInfo(path); | ||||
|     importInfo.namedExports.push(...namedImports); | ||||
|   } | ||||
| 
 | ||||
  preprocessExportStarAtIndex(index) {
    let exportedName = null;
    if (this.tokens.matches3AtIndex(index, _types.TokenType._export, _types.TokenType.star, _types.TokenType._as)) {
      // export * as
      index += 3;
      exportedName = this.tokens.identifierNameAtIndex(index);
      // foo from
      index += 2;
    } else {
      // export * from
      index += 3;
    }
    if (!this.tokens.matches1AtIndex(index, _types.TokenType.string)) {
      throw new Error("Expected string token at the end of star export statement.");
    }
    const path = this.tokens.stringValueAtIndex(index);
    const importInfo = this.getImportInfo(path);
    if (exportedName !== null) {
      importInfo.exportStarNames.push(exportedName);
    } else {
      importInfo.hasStarExport = true;
    }
  }

  getNamedImports(index) {
    const namedImports = [];
    while (true) {
      if (this.tokens.matches1AtIndex(index, _types.TokenType.braceR)) {
        index++;
        break;
      }

      const specifierInfo = _getImportExportSpecifierInfo2.default.call(void 0, this.tokens, index);
      index = specifierInfo.endIndex;
      if (!specifierInfo.isType) {
        namedImports.push({
          importedName: specifierInfo.leftName,
          localName: specifierInfo.rightName,
        });
      }

      if (this.tokens.matches2AtIndex(index, _types.TokenType.comma, _types.TokenType.braceR)) {
        index += 2;
        break;
      } else if (this.tokens.matches1AtIndex(index, _types.TokenType.braceR)) {
        index++;
        break;
      } else if (this.tokens.matches1AtIndex(index, _types.TokenType.comma)) {
        index++;
      } else {
        throw new Error(`Unexpected token: ${JSON.stringify(this.tokens.tokens[index])}`);
      }
    }
    return {newIndex: index, namedImports};
  }

  /**
   * Get a mutable import info object for this path, creating one if it doesn't
   * exist yet.
   */
  getImportInfo(path) {
    const existingInfo = this.importInfoByPath.get(path);
    if (existingInfo) {
      return existingInfo;
    }
    const newInfo = {
      defaultNames: [],
      wildcardNames: [],
      namedImports: [],
      namedExports: [],
      hasBareImport: false,
      exportStarNames: [],
      hasStarExport: false,
    };
    this.importInfoByPath.set(path, newInfo);
    return newInfo;
  }

  addExportBinding(localName, exportedName) {
    if (!this.exportBindingsByLocalName.has(localName)) {
      this.exportBindingsByLocalName.set(localName, []);
    }
    this.exportBindingsByLocalName.get(localName).push(exportedName);
  }

  /**
   * Return the code to use for the import for this path, or the empty string if
   * the code has already been "claimed" by a previous import.
   */
  claimImportCode(importPath) {
    const result = this.importsToReplace.get(importPath);
    this.importsToReplace.set(importPath, "");
    return result || "";
  }

  getIdentifierReplacement(identifierName) {
    return this.identifierReplacements.get(identifierName) || null;
  }

  /**
   * Return a string like `exports.foo = exports.bar`.
   */
  resolveExportBinding(assignedName) {
    const exportedNames = this.exportBindingsByLocalName.get(assignedName);
    if (!exportedNames || exportedNames.length === 0) {
      return null;
    }
    return exportedNames.map((exportedName) => `exports.${exportedName}`).join(" = ");
  }

  /**
   * Return all imported/exported names where we might be interested in whether usages of those
   * names are shadowed.
   */
  getGlobalNames() {
    return new Set([
      ...this.identifierReplacements.keys(),
      ...this.exportBindingsByLocalName.keys(),
    ]);
  }
} exports.default = CJSImportProcessor;
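The doc comment on `resolveExportBinding` above says it returns a string like `exports.foo = exports.bar`. A minimal standalone sketch (the map contents and the free-standing function are hypothetical, not part of the shipped module) shows how several exported names for one local binding collapse into a chained assignment:

```javascript
// One local name ("value") is exported twice, as "foo" and as "bar".
const exportBindingsByLocalName = new Map([["value", ["foo", "bar"]]]);

// Same logic as the resolveExportBinding method above, detached from the class.
function resolveExportBinding(assignedName) {
  const exportedNames = exportBindingsByLocalName.get(assignedName);
  if (!exportedNames || exportedNames.length === 0) {
    return null;
  }
  return exportedNames.map((exportedName) => `exports.${exportedName}`).join(" = ");
}

const chained = resolveExportBinding("value"); // "exports.foo = exports.bar"
const missing = resolveExportBinding("other"); // null
```

The transform can then emit `exports.foo = exports.bar = <rhs>` for an assignment to `value`.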
							
								
								
									
176  node_modules/sucrase/dist/HelperManager.js  generated  vendored  Normal file
@@ -0,0 +1,176 @@
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| 
 | ||||
| const HELPERS = { | ||||
|   require: ` | ||||
|     import {createRequire as CREATE_REQUIRE_NAME} from "module"; | ||||
|     const require = CREATE_REQUIRE_NAME(import.meta.url); | ||||
|   `,
 | ||||
|   interopRequireWildcard: ` | ||||
|     function interopRequireWildcard(obj) { | ||||
|       if (obj && obj.__esModule) { | ||||
|         return obj; | ||||
|       } else { | ||||
|         var newObj = {}; | ||||
|         if (obj != null) { | ||||
|           for (var key in obj) { | ||||
|             if (Object.prototype.hasOwnProperty.call(obj, key)) { | ||||
|               newObj[key] = obj[key]; | ||||
|             } | ||||
|           } | ||||
|         } | ||||
|         newObj.default = obj; | ||||
|         return newObj; | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   interopRequireDefault: ` | ||||
|     function interopRequireDefault(obj) { | ||||
|       return obj && obj.__esModule ? obj : { default: obj }; | ||||
|     } | ||||
|   `,
 | ||||
|   createNamedExportFrom: ` | ||||
|     function createNamedExportFrom(obj, localName, importedName) { | ||||
|       Object.defineProperty(exports, localName, {enumerable: true, configurable: true, get: () => obj[importedName]}); | ||||
|     } | ||||
|   `,
 | ||||
|   // Note that TypeScript and Babel do this differently; TypeScript does a simple existence
 | ||||
|   // check in the exports object and does a plain assignment, whereas Babel uses
 | ||||
|   // defineProperty and builds an object of explicitly-exported names so that star exports can
 | ||||
|   // always take lower precedence. For now, we do the easier TypeScript thing.
 | ||||
|   createStarExport: ` | ||||
|     function createStarExport(obj) { | ||||
|       Object.keys(obj) | ||||
|         .filter((key) => key !== "default" && key !== "__esModule") | ||||
|         .forEach((key) => { | ||||
|           if (exports.hasOwnProperty(key)) { | ||||
|             return; | ||||
|           } | ||||
|           Object.defineProperty(exports, key, {enumerable: true, configurable: true, get: () => obj[key]}); | ||||
|         }); | ||||
|     } | ||||
|   `,
 | ||||
|   nullishCoalesce: ` | ||||
|     function nullishCoalesce(lhs, rhsFn) { | ||||
|       if (lhs != null) { | ||||
|         return lhs; | ||||
|       } else { | ||||
|         return rhsFn(); | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   asyncNullishCoalesce: ` | ||||
|     async function asyncNullishCoalesce(lhs, rhsFn) { | ||||
|       if (lhs != null) { | ||||
|         return lhs; | ||||
|       } else { | ||||
|         return await rhsFn(); | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   optionalChain: ` | ||||
|     function optionalChain(ops) { | ||||
|       let lastAccessLHS = undefined; | ||||
|       let value = ops[0]; | ||||
|       let i = 1; | ||||
|       while (i < ops.length) { | ||||
|         const op = ops[i]; | ||||
|         const fn = ops[i + 1]; | ||||
|         i += 2; | ||||
|         if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { | ||||
|           return undefined; | ||||
|         } | ||||
|         if (op === 'access' || op === 'optionalAccess') { | ||||
|           lastAccessLHS = value; | ||||
|           value = fn(value); | ||||
|         } else if (op === 'call' || op === 'optionalCall') { | ||||
|           value = fn((...args) => value.call(lastAccessLHS, ...args)); | ||||
|           lastAccessLHS = undefined; | ||||
|         } | ||||
|       } | ||||
|       return value; | ||||
|     } | ||||
|   `,
 | ||||
|   asyncOptionalChain: ` | ||||
|     async function asyncOptionalChain(ops) { | ||||
|       let lastAccessLHS = undefined; | ||||
|       let value = ops[0]; | ||||
|       let i = 1; | ||||
|       while (i < ops.length) { | ||||
|         const op = ops[i]; | ||||
|         const fn = ops[i + 1]; | ||||
|         i += 2; | ||||
|         if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { | ||||
|           return undefined; | ||||
|         } | ||||
|         if (op === 'access' || op === 'optionalAccess') { | ||||
|           lastAccessLHS = value; | ||||
|           value = await fn(value); | ||||
|         } else if (op === 'call' || op === 'optionalCall') { | ||||
|           value = await fn((...args) => value.call(lastAccessLHS, ...args)); | ||||
|           lastAccessLHS = undefined; | ||||
|         } | ||||
|       } | ||||
|       return value; | ||||
|     } | ||||
|   `,
 | ||||
|   optionalChainDelete: ` | ||||
|     function optionalChainDelete(ops) { | ||||
|       const result = OPTIONAL_CHAIN_NAME(ops); | ||||
|       return result == null ? true : result; | ||||
|     } | ||||
|   `,
 | ||||
|   asyncOptionalChainDelete: ` | ||||
|     async function asyncOptionalChainDelete(ops) { | ||||
|       const result = await ASYNC_OPTIONAL_CHAIN_NAME(ops); | ||||
|       return result == null ? true : result; | ||||
|     } | ||||
|   `,
 | ||||
| }; | ||||
| 
 | ||||
|  class HelperManager { | ||||
|   __init() {this.helperNames = {}} | ||||
|   __init2() {this.createRequireName = null} | ||||
|   constructor( nameManager) {;this.nameManager = nameManager;HelperManager.prototype.__init.call(this);HelperManager.prototype.__init2.call(this);} | ||||
| 
 | ||||
|   getHelperName(baseName) { | ||||
|     let helperName = this.helperNames[baseName]; | ||||
|     if (helperName) { | ||||
|       return helperName; | ||||
|     } | ||||
|     helperName = this.nameManager.claimFreeName(`_${baseName}`); | ||||
|     this.helperNames[baseName] = helperName; | ||||
|     return helperName; | ||||
|   } | ||||
| 
 | ||||
|   emitHelpers() { | ||||
|     let resultCode = ""; | ||||
|     if (this.helperNames.optionalChainDelete) { | ||||
|       this.getHelperName("optionalChain"); | ||||
|     } | ||||
|     if (this.helperNames.asyncOptionalChainDelete) { | ||||
|       this.getHelperName("asyncOptionalChain"); | ||||
|     } | ||||
|     for (const [baseName, helperCodeTemplate] of Object.entries(HELPERS)) { | ||||
|       const helperName = this.helperNames[baseName]; | ||||
|       let helperCode = helperCodeTemplate; | ||||
|       if (baseName === "optionalChainDelete") { | ||||
|         helperCode = helperCode.replace("OPTIONAL_CHAIN_NAME", this.helperNames.optionalChain); | ||||
|       } else if (baseName === "asyncOptionalChainDelete") { | ||||
|         helperCode = helperCode.replace( | ||||
|           "ASYNC_OPTIONAL_CHAIN_NAME", | ||||
|           this.helperNames.asyncOptionalChain, | ||||
|         ); | ||||
|       } else if (baseName === "require") { | ||||
|         if (this.createRequireName === null) { | ||||
|           this.createRequireName = this.nameManager.claimFreeName("_createRequire"); | ||||
|         } | ||||
|         helperCode = helperCode.replace(/CREATE_REQUIRE_NAME/g, this.createRequireName); | ||||
|       } | ||||
|       if (helperName) { | ||||
|         resultCode += " "; | ||||
|         resultCode += helperCode.replace(baseName, helperName).replace(/\s+/g, " ").trim(); | ||||
|       } | ||||
|     } | ||||
|     return resultCode; | ||||
|   } | ||||
| } exports.HelperManager = HelperManager; | ||||
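The `optionalChain` helper above interprets a flat ops array that the transform emits for expressions like `obj?.foo.bar()`. The sketch below copies that helper's body verbatim and feeds it a hand-built ops array; the `[value, op, fn, op, fn, ...]` layout is inferred from the helper's own loop, not documented elsewhere in this file:

```javascript
// Copy of the optionalChain helper defined in HELPERS above.
function optionalChain(ops) {
  let lastAccessLHS = undefined;
  let value = ops[0];
  let i = 1;
  while (i < ops.length) {
    const op = ops[i];
    const fn = ops[i + 1];
    i += 2;
    // Short-circuit the whole chain on null/undefined at an optional link.
    if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) {
      return undefined;
    }
    if (op === 'access' || op === 'optionalAccess') {
      lastAccessLHS = value;
      value = fn(value);
    } else if (op === 'call' || op === 'optionalCall') {
      // The receiver of the call is the object the method was accessed on.
      value = fn((...args) => value.call(lastAccessLHS, ...args));
      lastAccessLHS = undefined;
    }
  }
  return value;
}

const obj = {foo: {bar: () => 42}};
// Roughly what `obj?.foo.bar()` compiles to:
const hit = optionalChain([obj, 'optionalAccess', (v) => v.foo, 'access', (v) => v.bar, 'call', (f) => f()]);
// With a null base, the chain short-circuits instead of throwing:
const miss = optionalChain([null, 'optionalAccess', (v) => v.foo, 'access', (v) => v.bar, 'call', (f) => f()]);
```

Note how `lastAccessLHS` carries the method receiver across the access/call pair so `this` binding matches native `?.` semantics.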
							
								
								
									
27  node_modules/sucrase/dist/NameManager.js  generated  vendored  Normal file
@@ -0,0 +1,27 @@
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } | ||||
| var _getIdentifierNames = require('./util/getIdentifierNames'); var _getIdentifierNames2 = _interopRequireDefault(_getIdentifierNames); | ||||
| 
 | ||||
|  class NameManager { | ||||
|     __init() {this.usedNames = new Set()} | ||||
| 
 | ||||
|   constructor(code, tokens) {;NameManager.prototype.__init.call(this); | ||||
|     this.usedNames = new Set(_getIdentifierNames2.default.call(void 0, code, tokens)); | ||||
|   } | ||||
| 
 | ||||
|   claimFreeName(name) { | ||||
|     const newName = this.findFreeName(name); | ||||
|     this.usedNames.add(newName); | ||||
|     return newName; | ||||
|   } | ||||
| 
 | ||||
|   findFreeName(name) { | ||||
|     if (!this.usedNames.has(name)) { | ||||
|       return name; | ||||
|     } | ||||
|     let suffixNum = 2; | ||||
|     while (this.usedNames.has(name + String(suffixNum))) { | ||||
|       suffixNum++; | ||||
|     } | ||||
|     return name + String(suffixNum); | ||||
|   } | ||||
| } exports.default = NameManager; | ||||
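`claimFreeName`/`findFreeName` above implement a simple collision scheme: return the name if unused, otherwise append the first free numeric suffix starting at 2. A standalone re-implementation (`SimpleNameManager` is a hypothetical name for illustration, not part of sucrase) makes the behavior easy to check:

```javascript
// Minimal sketch of the NameManager collision scheme, seeded with a plain
// array of already-used identifiers instead of tokenized source code.
class SimpleNameManager {
  constructor(usedNames) {
    this.usedNames = new Set(usedNames);
  }
  claimFreeName(name) {
    const newName = this.findFreeName(name);
    this.usedNames.add(newName); // claimed names collide with later requests
    return newName;
  }
  findFreeName(name) {
    if (!this.usedNames.has(name)) {
      return name;
    }
    let suffixNum = 2; // first fallback is name2, then name3, ...
    while (this.usedNames.has(name + String(suffixNum))) {
      suffixNum++;
    }
    return name + String(suffixNum);
  }
}

const mgr = new SimpleNameManager(["_foo"]);
const a = mgr.claimFreeName("_foo"); // "_foo2" (base name taken)
const b = mgr.claimFreeName("_foo"); // "_foo3" (base and _foo2 taken)
const c = mgr.claimFreeName("_bar"); // "_bar"  (free, returned as-is)
```

Each claim records the result, so repeated claims of the same base name keep walking the suffix sequence.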
							
								
								
									
42  node_modules/sucrase/dist/Options-gen-types.js  generated  vendored  Normal file
@@ -0,0 +1,42 @@
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireWildcard(obj) { if (obj && obj.__esModule) { return obj; } else { var newObj = {}; if (obj != null) { for (var key in obj) { if (Object.prototype.hasOwnProperty.call(obj, key)) { newObj[key] = obj[key]; } } } newObj.default = obj; return newObj; } }/** | ||||
|  * This module was automatically generated by `ts-interface-builder` | ||||
|  */ | ||||
| var _tsinterfacechecker = require('ts-interface-checker'); var t = _interopRequireWildcard(_tsinterfacechecker); | ||||
| // tslint:disable:object-literal-key-quotes
 | ||||
| 
 | ||||
|  const Transform = t.union( | ||||
|   t.lit("jsx"), | ||||
|   t.lit("typescript"), | ||||
|   t.lit("flow"), | ||||
|   t.lit("imports"), | ||||
|   t.lit("react-hot-loader"), | ||||
|   t.lit("jest"), | ||||
| ); exports.Transform = Transform; | ||||
| 
 | ||||
|  const SourceMapOptions = t.iface([], { | ||||
|   compiledFilename: "string", | ||||
| }); exports.SourceMapOptions = SourceMapOptions; | ||||
| 
 | ||||
|  const Options = t.iface([], { | ||||
|   transforms: t.array("Transform"), | ||||
|   disableESTransforms: t.opt("boolean"), | ||||
|   jsxRuntime: t.opt(t.union(t.lit("classic"), t.lit("automatic"), t.lit("preserve"))), | ||||
|   production: t.opt("boolean"), | ||||
|   jsxImportSource: t.opt("string"), | ||||
|   jsxPragma: t.opt("string"), | ||||
|   jsxFragmentPragma: t.opt("string"), | ||||
|   keepUnusedImports: t.opt("boolean"), | ||||
|   preserveDynamicImport: t.opt("boolean"), | ||||
|   injectCreateRequireForImportRequire: t.opt("boolean"), | ||||
|   enableLegacyTypeScriptModuleInterop: t.opt("boolean"), | ||||
|   enableLegacyBabel5ModuleInterop: t.opt("boolean"), | ||||
|   sourceMapOptions: t.opt("SourceMapOptions"), | ||||
|   filePath: t.opt("string"), | ||||
| }); exports.Options = Options; | ||||
| 
 | ||||
| const exportedTypeSuite = { | ||||
|   Transform: exports.Transform, | ||||
|   SourceMapOptions: exports.SourceMapOptions, | ||||
|   Options: exports.Options, | ||||
| }; | ||||
| exports. default = exportedTypeSuite; | ||||
							
								
								
									
101  node_modules/sucrase/dist/Options.js  generated  vendored  Normal file
@@ -0,0 +1,101 @@
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }var _tsinterfacechecker = require('ts-interface-checker'); | ||||
| 
 | ||||
| var _Optionsgentypes = require('./Options-gen-types'); var _Optionsgentypes2 = _interopRequireDefault(_Optionsgentypes); | ||||
| 
 | ||||
| const {Options: OptionsChecker} = _tsinterfacechecker.createCheckers.call(void 0, _Optionsgentypes2.default); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
|  function validateOptions(options) { | ||||
|   OptionsChecker.strictCheck(options); | ||||
| } exports.validateOptions = validateOptions; | ||||
							
								
								
									
357  node_modules/sucrase/dist/TokenProcessor.js  generated  vendored  Normal file
@@ -0,0 +1,357 @@
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } | ||||
| 
 | ||||
| 
 | ||||
| var _types = require('./parser/tokenizer/types'); | ||||
| var _isAsyncOperation = require('./util/isAsyncOperation'); var _isAsyncOperation2 = _interopRequireDefault(_isAsyncOperation); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
|  class TokenProcessor { | ||||
|    __init() {this.resultCode = ""} | ||||
|   // Array mapping input token index to optional string index position in the
 | ||||
|   // output code.
 | ||||
|    __init2() {this.resultMappings = new Array(this.tokens.length)} | ||||
|    __init3() {this.tokenIndex = 0} | ||||
| 
 | ||||
|   constructor( | ||||
|      code, | ||||
|      tokens, | ||||
|      isFlowEnabled, | ||||
|      disableESTransforms, | ||||
|      helperManager, | ||||
|   ) {;this.code = code;this.tokens = tokens;this.isFlowEnabled = isFlowEnabled;this.disableESTransforms = disableESTransforms;this.helperManager = helperManager;TokenProcessor.prototype.__init.call(this);TokenProcessor.prototype.__init2.call(this);TokenProcessor.prototype.__init3.call(this);} | ||||
| 
 | ||||
|   /** | ||||
|    * Snapshot the token state in a way that can be restored later, useful for | ||||
|    * things like lookahead. | ||||
|    * | ||||
|    * resultMappings do not need to be copied since in all use cases, they will | ||||
|    * be overwritten anyway after restore. | ||||
|    */ | ||||
|   snapshot() { | ||||
|     return { | ||||
|       resultCode: this.resultCode, | ||||
|       tokenIndex: this.tokenIndex, | ||||
|     }; | ||||
|   } | ||||
| 
 | ||||
|   restoreToSnapshot(snapshot) { | ||||
|     this.resultCode = snapshot.resultCode; | ||||
|     this.tokenIndex = snapshot.tokenIndex; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove and return the code generated since the snapshot, leaving the | ||||
|    * current token position in-place. Unlike most TokenProcessor operations, | ||||
|    * this operation can result in input/output line number mismatches because | ||||
|    * the removed code may contain newlines, so this operation should be used | ||||
|    * sparingly. | ||||
|    */ | ||||
|   dangerouslyGetAndRemoveCodeSinceSnapshot(snapshot) { | ||||
|     const result = this.resultCode.slice(snapshot.resultCode.length); | ||||
|     this.resultCode = snapshot.resultCode; | ||||
|     return result; | ||||
|   } | ||||
| 
 | ||||
|   reset() { | ||||
|     this.resultCode = ""; | ||||
|     this.resultMappings = new Array(this.tokens.length); | ||||
|     this.tokenIndex = 0; | ||||
|   } | ||||
| 
 | ||||
|   matchesContextualAtIndex(index, contextualKeyword) { | ||||
|     return ( | ||||
|       this.matches1AtIndex(index, _types.TokenType.name) && | ||||
|       this.tokens[index].contextualKeyword === contextualKeyword | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameAtIndex(index) { | ||||
|     // TODO: We need to process escapes since technically you can have unicode escapes in variable
 | ||||
|     // names.
 | ||||
|     return this.identifierNameForToken(this.tokens[index]); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameAtRelativeIndex(relativeIndex) { | ||||
|     return this.identifierNameForToken(this.tokenAtRelativeIndex(relativeIndex)); | ||||
|   } | ||||
| 
 | ||||
|   identifierName() { | ||||
|     return this.identifierNameForToken(this.currentToken()); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameForToken(token) { | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   rawCodeForToken(token) { | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   stringValueAtIndex(index) { | ||||
|     return this.stringValueForToken(this.tokens[index]); | ||||
|   } | ||||
| 
 | ||||
|   stringValue() { | ||||
|     return this.stringValueForToken(this.currentToken()); | ||||
|   } | ||||
| 
 | ||||
|   stringValueForToken(token) { | ||||
|     // This is used to identify when two imports are the same and to resolve TypeScript enum keys.
 | ||||
|     // Ideally we'd process escapes within the strings, but for now we pretty much take the raw
 | ||||
|     // code.
 | ||||
|     return this.code.slice(token.start + 1, token.end - 1); | ||||
|   } | ||||
| 
 | ||||
|   matches1AtIndex(index, t1) { | ||||
|     return this.tokens[index].type === t1; | ||||
|   } | ||||
| 
 | ||||
|   matches2AtIndex(index, t1, t2) { | ||||
|     return this.tokens[index].type === t1 && this.tokens[index + 1].type === t2; | ||||
|   } | ||||
| 
 | ||||
|   matches3AtIndex(index, t1, t2, t3) { | ||||
|     return ( | ||||
|       this.tokens[index].type === t1 && | ||||
|       this.tokens[index + 1].type === t2 && | ||||
|       this.tokens[index + 2].type === t3 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches1(t1) { | ||||
|     return this.tokens[this.tokenIndex].type === t1; | ||||
|   } | ||||
| 
 | ||||
|   matches2(t1, t2) { | ||||
|     return this.tokens[this.tokenIndex].type === t1 && this.tokens[this.tokenIndex + 1].type === t2; | ||||
|   } | ||||
| 
 | ||||
|   matches3(t1, t2, t3) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches4(t1, t2, t3, t4) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 && | ||||
|       this.tokens[this.tokenIndex + 3].type === t4 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches5(t1, t2, t3, t4, t5) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 && | ||||
|       this.tokens[this.tokenIndex + 3].type === t4 && | ||||
|       this.tokens[this.tokenIndex + 4].type === t5 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matchesContextual(contextualKeyword) { | ||||
|     return this.matchesContextualAtIndex(this.tokenIndex, contextualKeyword); | ||||
|   } | ||||
| 
 | ||||
|   matchesContextIdAndLabel(type, contextId) { | ||||
|     return this.matches1(type) && this.currentToken().contextId === contextId; | ||||
|   } | ||||
| 
 | ||||
|   previousWhitespaceAndComments() { | ||||
|     let whitespaceAndComments = this.code.slice( | ||||
|       this.tokenIndex > 0 ? this.tokens[this.tokenIndex - 1].end : 0, | ||||
|       this.tokenIndex < this.tokens.length ? this.tokens[this.tokenIndex].start : this.code.length, | ||||
|     ); | ||||
|     if (this.isFlowEnabled) { | ||||
|       whitespaceAndComments = whitespaceAndComments.replace(/@flow/g, ""); | ||||
|     } | ||||
|     return whitespaceAndComments; | ||||
|   } | ||||
| 
 | ||||
|   replaceToken(newCode) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += newCode; | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   replaceTokenTrimmingLeftWhitespace(newCode) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments().replace(/[^\r\n]/g, ""); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += newCode; | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   removeInitialToken() { | ||||
|     this.replaceToken(""); | ||||
|   } | ||||
| 
 | ||||
|   removeToken() { | ||||
|     this.replaceTokenTrimmingLeftWhitespace(""); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove all code until the next }, accounting for balanced braces. | ||||
|    */ | ||||
|   removeBalancedCode() { | ||||
|     let braceDepth = 0; | ||||
|     while (!this.isAtEnd()) { | ||||
|       if (this.matches1(_types.TokenType.braceL)) { | ||||
|         braceDepth++; | ||||
|       } else if (this.matches1(_types.TokenType.braceR)) { | ||||
|         if (braceDepth === 0) { | ||||
|           return; | ||||
|         } | ||||
|         braceDepth--; | ||||
|       } | ||||
|       this.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   copyExpectedToken(tokenType) { | ||||
|     if (this.tokens[this.tokenIndex].type !== tokenType) { | ||||
|       throw new Error(`Expected token ${tokenType}`); | ||||
|     } | ||||
|     this.copyToken(); | ||||
|   } | ||||
| 
 | ||||
|   copyToken() { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += this.code.slice( | ||||
|       this.tokens[this.tokenIndex].start, | ||||
|       this.tokens[this.tokenIndex].end, | ||||
|     ); | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   copyTokenWithPrefix(prefix) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultCode += prefix; | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += this.code.slice( | ||||
|       this.tokens[this.tokenIndex].start, | ||||
|       this.tokens[this.tokenIndex].end, | ||||
|     ); | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|    appendTokenPrefix() { | ||||
|     const token = this.currentToken(); | ||||
|     if (token.numNullishCoalesceStarts || token.isOptionalChainStart) { | ||||
|       token.isAsyncOperation = _isAsyncOperation2.default.call(void 0, this); | ||||
|     } | ||||
|     if (this.disableESTransforms) { | ||||
|       return; | ||||
|     } | ||||
|     if (token.numNullishCoalesceStarts) { | ||||
|       for (let i = 0; i < token.numNullishCoalesceStarts; i++) { | ||||
|         if (token.isAsyncOperation) { | ||||
|           this.resultCode += "await "; | ||||
|           this.resultCode += this.helperManager.getHelperName("asyncNullishCoalesce"); | ||||
|         } else { | ||||
|           this.resultCode += this.helperManager.getHelperName("nullishCoalesce"); | ||||
|         } | ||||
|         this.resultCode += "("; | ||||
|       } | ||||
|     } | ||||
|     if (token.isOptionalChainStart) { | ||||
|       if (token.isAsyncOperation) { | ||||
|         this.resultCode += "await "; | ||||
|       } | ||||
|       if (this.tokenIndex > 0 && this.tokenAtRelativeIndex(-1).type === _types.TokenType._delete) { | ||||
|         if (token.isAsyncOperation) { | ||||
|           this.resultCode += this.helperManager.getHelperName("asyncOptionalChainDelete"); | ||||
|         } else { | ||||
|           this.resultCode += this.helperManager.getHelperName("optionalChainDelete"); | ||||
|         } | ||||
|       } else if (token.isAsyncOperation) { | ||||
|         this.resultCode += this.helperManager.getHelperName("asyncOptionalChain"); | ||||
|       } else { | ||||
|         this.resultCode += this.helperManager.getHelperName("optionalChain"); | ||||
|       } | ||||
|       this.resultCode += "(["; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    appendTokenSuffix() { | ||||
|     const token = this.currentToken(); | ||||
|     if (token.isOptionalChainEnd && !this.disableESTransforms) { | ||||
|       this.resultCode += "])"; | ||||
|     } | ||||
|     if (token.numNullishCoalesceEnds && !this.disableESTransforms) { | ||||
|       for (let i = 0; i < token.numNullishCoalesceEnds; i++) { | ||||
|         this.resultCode += "))"; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   appendCode(code) { | ||||
|     this.resultCode += code; | ||||
|   } | ||||
| 
 | ||||
|   currentToken() { | ||||
|     return this.tokens[this.tokenIndex]; | ||||
|   } | ||||
| 
 | ||||
|   currentTokenCode() { | ||||
|     const token = this.currentToken(); | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   tokenAtRelativeIndex(relativeIndex) { | ||||
|     return this.tokens[this.tokenIndex + relativeIndex]; | ||||
|   } | ||||
| 
 | ||||
|   currentIndex() { | ||||
|     return this.tokenIndex; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Move to the next token. Only suitable in preprocessing steps. When | ||||
|    * generating new code, you should use copyToken or removeToken. | ||||
|    */ | ||||
|   nextToken() { | ||||
|     if (this.tokenIndex === this.tokens.length) { | ||||
|       throw new Error("Unexpectedly reached end of input."); | ||||
|     } | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   previousToken() { | ||||
|     this.tokenIndex--; | ||||
|   } | ||||
| 
 | ||||
|   finish() { | ||||
|     if (this.tokenIndex !== this.tokens.length) { | ||||
|       throw new Error("Tried to finish processing tokens before reaching the end."); | ||||
|     } | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     return {code: this.resultCode, mappings: this.resultMappings}; | ||||
|   } | ||||
| 
 | ||||
|   isAtEnd() { | ||||
|     return this.tokenIndex === this.tokens.length; | ||||
|   } | ||||
| } exports.default = TokenProcessor; | ||||
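The unrolled `matches1` through `matches5` helpers above all follow one pattern: compare a run of token types starting at an index. A minimal generic sketch of that pattern (not sucrase's actual code — it uses simplified `{type}` token objects and adds a bounds check that the unrolled variants skip):

```javascript
// Compare the token types starting at `index` against an expected sequence.
function matchesAt(tokens, index, ...types) {
  // Out-of-range lookahead never matches.
  if (index + types.length > tokens.length) {
    return false;
  }
  return types.every((type, i) => tokens[index + i].type === type);
}

// Simplified token stream for `import foo =` (string type names stand in
// for sucrase's TokenType enum values).
const tokens = [{type: "_import"}, {type: "name"}, {type: "eq"}];

matchesAt(tokens, 0, "_import", "name");       // true
matchesAt(tokens, 0, "_import", "name", "eq"); // true
matchesAt(tokens, 1, "eq");                    // false
```

Sucrase unrolls the loop into fixed-arity methods instead, presumably to avoid per-call array allocation on this hot path.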
320  node_modules/sucrase/dist/cli.js  generated  vendored  Normal file
							|  | @ -0,0 +1,320 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }/* eslint-disable no-console */ | ||||
| var _commander = require('commander'); var _commander2 = _interopRequireDefault(_commander); | ||||
| var _glob = require('glob'); var _glob2 = _interopRequireDefault(_glob); | ||||
| var _fs = require('mz/fs'); | ||||
| var _path = require('path'); | ||||
| var _util = require('util'); | ||||
| 
 | ||||
| var _index = require('./index'); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| const glob = _util.promisify.call(void 0, _glob2.default); | ||||
| 
 | ||||
|  function run() { | ||||
|   _commander2.default | ||||
|     .description(`Sucrase: super-fast Babel alternative.`) | ||||
|     .usage("[options] <srcDir>") | ||||
|     .option( | ||||
|       "-d, --out-dir <out>", | ||||
|       "Compile an input directory of modules into an output directory.", | ||||
|     ) | ||||
|     .option( | ||||
|       "-p, --project <dir>", | ||||
|       "Compile a TypeScript project, will read from tsconfig.json in <dir>", | ||||
|     ) | ||||
|     .option("--out-extension <extension>", "File extension to use for all output files.", "js") | ||||
|     .option("--exclude-dirs <paths>", "Names of directories that should not be traversed.") | ||||
|     .option("-q, --quiet", "Don't print the names of converted files.") | ||||
|     .option("-t, --transforms <transforms>", "Comma-separated list of transforms to run.") | ||||
|     .option("--disable-es-transforms", "Opt out of all ES syntax transforms.") | ||||
|     .option("--jsx-runtime <string>", "Transformation mode for the JSX transform.") | ||||
|     .option("--production", "Disable debugging information from JSX in output.") | ||||
|     .option( | ||||
|       "--jsx-import-source <string>", | ||||
|       "Automatic JSX transform import path prefix, defaults to `react`.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--jsx-pragma <string>", | ||||
|       "Classic JSX transform element creation function, defaults to `React.createElement`.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--jsx-fragment-pragma <string>", | ||||
|       "Classic JSX transform fragment component, defaults to `React.Fragment`.", | ||||
|     ) | ||||
|     .option("--keep-unused-imports", "Disable automatic removal of type-only imports/exports.") | ||||
|     .option("--preserve-dynamic-import", "Don't transpile dynamic import() to require.") | ||||
|     .option( | ||||
|       "--inject-create-require-for-import-require", | ||||
|       "Use `createRequire` when transpiling TS `import = require` to ESM.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--enable-legacy-typescript-module-interop", | ||||
|       "Use default TypeScript ESM/CJS interop strategy.", | ||||
|     ) | ||||
|     .option("--enable-legacy-babel5-module-interop", "Use Babel 5 ESM/CJS interop strategy.") | ||||
|     .parse(process.argv); | ||||
| 
 | ||||
|   if (_commander2.default.project) { | ||||
|     if ( | ||||
|       _commander2.default.outDir || | ||||
|       _commander2.default.transforms || | ||||
|       _commander2.default.args[0] || | ||||
|       _commander2.default.enableLegacyTypescriptModuleInterop | ||||
|     ) { | ||||
|       console.error( | ||||
|         "If TypeScript project is specified, out directory, transforms, source " + | ||||
|           "directory, and --enable-legacy-typescript-module-interop may not be specified.", | ||||
|       ); | ||||
|       process.exit(1); | ||||
|     } | ||||
|   } else { | ||||
|     if (!_commander2.default.outDir) { | ||||
|       console.error("Out directory is required"); | ||||
|       process.exit(1); | ||||
|     } | ||||
| 
 | ||||
|     if (!_commander2.default.transforms) { | ||||
|       console.error("Transforms option is required."); | ||||
|       process.exit(1); | ||||
|     } | ||||
| 
 | ||||
|     if (!_commander2.default.args[0]) { | ||||
|       console.error("Source directory is required."); | ||||
|       process.exit(1); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   const options = { | ||||
|     outDirPath: _commander2.default.outDir, | ||||
|     srcDirPath: _commander2.default.args[0], | ||||
|     project: _commander2.default.project, | ||||
|     outExtension: _commander2.default.outExtension, | ||||
|     excludeDirs: _commander2.default.excludeDirs ? _commander2.default.excludeDirs.split(",") : [], | ||||
|     quiet: _commander2.default.quiet, | ||||
|     sucraseOptions: { | ||||
|       transforms: _commander2.default.transforms ? _commander2.default.transforms.split(",") : [], | ||||
|       disableESTransforms: _commander2.default.disableEsTransforms, | ||||
|       jsxRuntime: _commander2.default.jsxRuntime, | ||||
|       production: _commander2.default.production, | ||||
|       jsxImportSource: _commander2.default.jsxImportSource, | ||||
|       jsxPragma: _commander2.default.jsxPragma || "React.createElement", | ||||
|       jsxFragmentPragma: _commander2.default.jsxFragmentPragma || "React.Fragment", | ||||
|       keepUnusedImports: _commander2.default.keepUnusedImports, | ||||
|       preserveDynamicImport: _commander2.default.preserveDynamicImport, | ||||
|       injectCreateRequireForImportRequire: _commander2.default.injectCreateRequireForImportRequire, | ||||
|       enableLegacyTypeScriptModuleInterop: _commander2.default.enableLegacyTypescriptModuleInterop, | ||||
|       enableLegacyBabel5ModuleInterop: _commander2.default.enableLegacyBabel5ModuleInterop, | ||||
|     }, | ||||
|   }; | ||||
| 
 | ||||
|   buildDirectory(options).catch((e) => { | ||||
|     process.exitCode = 1; | ||||
|     console.error(e); | ||||
|   }); | ||||
| } exports.default = run; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| async function findFiles(options) { | ||||
|   const outDirPath = options.outDirPath; | ||||
|   const srcDirPath = options.srcDirPath; | ||||
| 
 | ||||
|   const extensions = options.sucraseOptions.transforms.includes("typescript") | ||||
|     ? [".ts", ".tsx"] | ||||
|     : [".js", ".jsx"]; | ||||
| 
 | ||||
|   if (!(await _fs.exists.call(void 0, outDirPath))) { | ||||
|     await _fs.mkdir.call(void 0, outDirPath); | ||||
|   } | ||||
| 
 | ||||
|   const outArr = []; | ||||
|   for (const child of await _fs.readdir.call(void 0, srcDirPath)) { | ||||
|     if (["node_modules", ".git"].includes(child) || options.excludeDirs.includes(child)) { | ||||
|       continue; | ||||
|     } | ||||
|     const srcChildPath = _path.join.call(void 0, srcDirPath, child); | ||||
|     const outChildPath = _path.join.call(void 0, outDirPath, child); | ||||
|     if ((await _fs.stat.call(void 0, srcChildPath)).isDirectory()) { | ||||
|       const innerOptions = {...options}; | ||||
|       innerOptions.srcDirPath = srcChildPath; | ||||
|       innerOptions.outDirPath = outChildPath; | ||||
|       const innerFiles = await findFiles(innerOptions); | ||||
|       outArr.push(...innerFiles); | ||||
|     } else if (extensions.some((ext) => srcChildPath.endsWith(ext))) { | ||||
|       const outPath = outChildPath.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
|       outArr.push({ | ||||
|         srcPath: srcChildPath, | ||||
|         outPath, | ||||
|       }); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   return outArr; | ||||
| } | ||||
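The output-path rewrite `findFiles` performs is just an extension swap: the source file's final extension is replaced with the configured `--out-extension` (default `js`). A standalone sketch (the paths are illustrative, not from the source):

```javascript
// Swap a source file's extension for the configured output extension,
// mirroring the replace() call in findFiles above.
function toOutPath(outChildPath, outExtension) {
  return outChildPath.replace(/\.\w+$/, `.${outExtension}`);
}

toOutPath("out/components/App.tsx", "js"); // → "out/components/App.js"
toOutPath("out/util/index.ts", "mjs");     // → "out/util/index.mjs"
```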
| 
 | ||||
| async function runGlob(options) { | ||||
|   const tsConfigPath = _path.join.call(void 0, options.project, "tsconfig.json"); | ||||
| 
 | ||||
|   let str; | ||||
|   try { | ||||
|     str = await _fs.readFile.call(void 0, tsConfigPath, "utf8"); | ||||
|   } catch (err) { | ||||
|     console.error("Could not find project tsconfig.json"); | ||||
|     console.error(`  --project=${options.project}`); | ||||
|     console.error(err); | ||||
|     process.exit(1); | ||||
|   } | ||||
|   const json = JSON.parse(str); | ||||
| 
 | ||||
|   const foundFiles = []; | ||||
| 
 | ||||
|   const files = json.files; | ||||
|   const include = json.include; | ||||
| 
 | ||||
|   const absProject = _path.join.call(void 0, process.cwd(), options.project); | ||||
|   const outDirs = []; | ||||
| 
 | ||||
|   if (!(await _fs.exists.call(void 0, options.outDirPath))) { | ||||
|     await _fs.mkdir.call(void 0, options.outDirPath); | ||||
|   } | ||||
| 
 | ||||
|   if (files) { | ||||
|     for (const file of files) { | ||||
|       if (file.endsWith(".d.ts")) { | ||||
|         continue; | ||||
|       } | ||||
|       if (!file.endsWith(".ts") && !file.endsWith(".js")) { | ||||
|         continue; | ||||
|       } | ||||
| 
 | ||||
|       const srcFile = _path.join.call(void 0, absProject, file); | ||||
|       const outFile = _path.join.call(void 0, options.outDirPath, file); | ||||
|       const outPath = outFile.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
| 
 | ||||
|       const outDir = _path.dirname.call(void 0, outPath); | ||||
|       if (!outDirs.includes(outDir)) { | ||||
|         outDirs.push(outDir); | ||||
|       } | ||||
| 
 | ||||
|       foundFiles.push({ | ||||
|         srcPath: srcFile, | ||||
|         outPath, | ||||
|       }); | ||||
|     } | ||||
|   } | ||||
|   if (include) { | ||||
|     for (const pattern of include) { | ||||
|       const globFiles = await glob(_path.join.call(void 0, absProject, pattern)); | ||||
|       for (const file of globFiles) { | ||||
|         if (!file.endsWith(".ts") && !file.endsWith(".js")) { | ||||
|           continue; | ||||
|         } | ||||
|         if (file.endsWith(".d.ts")) { | ||||
|           continue; | ||||
|         } | ||||
| 
 | ||||
|         const relativeFile = _path.relative.call(void 0, absProject, file); | ||||
|         const outFile = _path.join.call(void 0, options.outDirPath, relativeFile); | ||||
|         const outPath = outFile.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
| 
 | ||||
|         const outDir = _path.dirname.call(void 0, outPath); | ||||
|         if (!outDirs.includes(outDir)) { | ||||
|           outDirs.push(outDir); | ||||
|         } | ||||
| 
 | ||||
|         foundFiles.push({ | ||||
|           srcPath: file, | ||||
|           outPath, | ||||
|         }); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   for (const outDirPath of outDirs) { | ||||
|     if (!(await _fs.exists.call(void 0, outDirPath))) { | ||||
|       await _fs.mkdir.call(void 0, outDirPath); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   // TODO: read exclude
 | ||||
| 
 | ||||
|   return foundFiles; | ||||
| } | ||||
| 
 | ||||
| async function updateOptionsFromProject(options) { | ||||
|   /** | ||||
|    * Read the project information and assign the following. | ||||
|    *  - outDirPath | ||||
|    *  - transform: imports | ||||
|    *  - transform: typescript | ||||
|    *  - enableLegacyTypescriptModuleInterop: true/false. | ||||
|    */ | ||||
| 
 | ||||
|   const tsConfigPath = _path.join.call(void 0, options.project, "tsconfig.json"); | ||||
| 
 | ||||
|   let str; | ||||
|   try { | ||||
|     str = await _fs.readFile.call(void 0, tsConfigPath, "utf8"); | ||||
|   } catch (err) { | ||||
|     console.error("Could not find project tsconfig.json"); | ||||
|     console.error(`  --project=${options.project}`); | ||||
|     console.error(err); | ||||
|     process.exit(1); | ||||
|   } | ||||
|   const json = JSON.parse(str); | ||||
|   const sucraseOpts = options.sucraseOptions; | ||||
|   if (!sucraseOpts.transforms.includes("typescript")) { | ||||
|     sucraseOpts.transforms.push("typescript"); | ||||
|   } | ||||
| 
 | ||||
|   const compilerOpts = json.compilerOptions; | ||||
|   if (compilerOpts.outDir) { | ||||
|     options.outDirPath = _path.join.call(void 0, process.cwd(), options.project, compilerOpts.outDir); | ||||
|   } | ||||
|   if (compilerOpts.esModuleInterop !== true) { | ||||
|     sucraseOpts.enableLegacyTypeScriptModuleInterop = true; | ||||
|   } | ||||
|   if (compilerOpts.module === "commonjs") { | ||||
|     if (!sucraseOpts.transforms.includes("imports")) { | ||||
|       sucraseOpts.transforms.push("imports"); | ||||
|     } | ||||
|   } | ||||
| } | ||||
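`updateOptionsFromProject` consults three `compilerOptions` fields. A hypothetical minimal `tsconfig.json` (the values are illustrative assumptions, not from the source) that exercises each branch — `outDir` overrides `outDirPath`, an `esModuleInterop` other than `true` enables legacy interop, and `module: "commonjs"` adds the `imports` transform:

```json
{
  "compilerOptions": {
    "outDir": "dist",
    "esModuleInterop": true,
    "module": "commonjs"
  }
}
```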
| 
 | ||||
| async function buildDirectory(options) { | ||||
|   let files; | ||||
|   if (options.outDirPath && options.srcDirPath) { | ||||
|     files = await findFiles(options); | ||||
|   } else if (options.project) { | ||||
|     await updateOptionsFromProject(options); | ||||
|     files = await runGlob(options); | ||||
|   } else { | ||||
|     console.error("Project or Source directory required."); | ||||
|     process.exit(1); | ||||
|   } | ||||
| 
 | ||||
|   for (const file of files) { | ||||
|     await buildFile(file.srcPath, file.outPath, options); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| async function buildFile(srcPath, outPath, options) { | ||||
|   if (!options.quiet) { | ||||
|     console.log(`${srcPath} -> ${outPath}`); | ||||
|   } | ||||
|   const code = (await _fs.readFile.call(void 0, srcPath)).toString(); | ||||
|   const transformedCode = _index.transform.call(void 0, code, {...options.sucraseOptions, filePath: srcPath}).code; | ||||
|   await _fs.writeFile.call(void 0, outPath, transformedCode); | ||||
| } | ||||
89  node_modules/sucrase/dist/computeSourceMap.js  generated  vendored  Normal file
							|  | @ -0,0 +1,89 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _genmapping = require('@jridgewell/gen-mapping'); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _charcodes = require('./parser/util/charcodes'); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Generate a source map indicating that each line maps directly to the original line, | ||||
|  * with the tokens in their new positions. | ||||
|  */ | ||||
|  function computeSourceMap( | ||||
|   {code: generatedCode, mappings: rawMappings}, | ||||
|   filePath, | ||||
|   options, | ||||
|   source, | ||||
|   tokens, | ||||
| ) { | ||||
|   const sourceColumns = computeSourceColumns(source, tokens); | ||||
|   const map = new (0, _genmapping.GenMapping)({file: options.compiledFilename}); | ||||
|   let tokenIndex = 0; | ||||
|   // currentMapping is the output source index for the current input token being
 | ||||
|   // considered.
 | ||||
|   let currentMapping = rawMappings[0]; | ||||
|   while (currentMapping === undefined && tokenIndex < rawMappings.length - 1) { | ||||
|     tokenIndex++; | ||||
|     currentMapping = rawMappings[tokenIndex]; | ||||
|   } | ||||
|   let line = 0; | ||||
|   let lineStart = 0; | ||||
|   if (currentMapping !== lineStart) { | ||||
|     _genmapping.maybeAddSegment.call(void 0, map, line, 0, filePath, line, 0); | ||||
|   } | ||||
|   for (let i = 0; i < generatedCode.length; i++) { | ||||
|     if (i === currentMapping) { | ||||
|       const genColumn = currentMapping - lineStart; | ||||
|       const sourceColumn = sourceColumns[tokenIndex]; | ||||
|       _genmapping.maybeAddSegment.call(void 0, map, line, genColumn, filePath, line, sourceColumn); | ||||
|       while ( | ||||
|         (currentMapping === i || currentMapping === undefined) && | ||||
|         tokenIndex < rawMappings.length - 1 | ||||
|       ) { | ||||
|         tokenIndex++; | ||||
|         currentMapping = rawMappings[tokenIndex]; | ||||
|       } | ||||
|     } | ||||
|     if (generatedCode.charCodeAt(i) === _charcodes.charCodes.lineFeed) { | ||||
|       line++; | ||||
|       lineStart = i + 1; | ||||
|       if (currentMapping !== lineStart) { | ||||
|         _genmapping.maybeAddSegment.call(void 0, map, line, 0, filePath, line, 0); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   const {sourceRoot, sourcesContent, ...sourceMap} = _genmapping.toEncodedMap.call(void 0, map); | ||||
|   return sourceMap ; | ||||
| } exports.default = computeSourceMap; | ||||
| 
 | ||||
| /** | ||||
|  * Create an array mapping each token index to the 0-based column of the start | ||||
|  * position of the token. | ||||
|  */ | ||||
| function computeSourceColumns(code, tokens) { | ||||
|   const sourceColumns = new Array(tokens.length); | ||||
|   let tokenIndex = 0; | ||||
|   let currentMapping = tokens[tokenIndex].start; | ||||
|   let lineStart = 0; | ||||
|   for (let i = 0; i < code.length; i++) { | ||||
|     if (i === currentMapping) { | ||||
|       sourceColumns[tokenIndex] = currentMapping - lineStart; | ||||
|       tokenIndex++; | ||||
|       currentMapping = tokens[tokenIndex].start; | ||||
|     } | ||||
|     if (code.charCodeAt(i) === _charcodes.charCodes.lineFeed) { | ||||
|       lineStart = i + 1; | ||||
|     } | ||||
|   } | ||||
|   return sourceColumns; | ||||
| } | ||||
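The `computeSourceColumns` function above makes a single pass over the code, tracking the start offset of the current line so each token's column is `start - lineStart`. A self-contained version of the same single-pass idea (with an added bounds check, which the original does not need because sucrase's token list ends with an EOF token):

```javascript
// Record each token's 0-based column by scanning the code once and
// remembering where the current line begins.
function computeSourceColumns(code, tokens) {
  const sourceColumns = new Array(tokens.length);
  let tokenIndex = 0;
  let lineStart = 0;
  for (let i = 0; i < code.length && tokenIndex < tokens.length; i++) {
    if (i === tokens[tokenIndex].start) {
      sourceColumns[tokenIndex] = i - lineStart;
      tokenIndex++;
    }
    if (code.charCodeAt(i) === 10 /* "\n" */) {
      lineStart = i + 1;
    }
  }
  return sourceColumns;
}

const code = "let x;\nlet y;";
// Tokens starting at "let" (0), "x" (4), "let" (7), "y" (11).
computeSourceColumns(code, [{start: 0}, {start: 4}, {start: 7}, {start: 11}]);
// → [0, 4, 0, 4]
```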
456  node_modules/sucrase/dist/esm/CJSImportProcessor.js  generated  vendored  Normal file
							|  | @ -0,0 +1,456 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| import {isDeclaration} from "./parser/tokenizer"; | ||||
| import {ContextualKeyword} from "./parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "./parser/tokenizer/types"; | ||||
| 
 | ||||
| import getImportExportSpecifierInfo from "./util/getImportExportSpecifierInfo"; | ||||
| import {getNonTypeIdentifiers} from "./util/getNonTypeIdentifiers"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Class responsible for preprocessing and bookkeeping import and export declarations within the | ||||
|  * file. | ||||
|  * | ||||
|  * TypeScript uses a simpler mechanism that does not use functions like interopRequireDefault and | ||||
|  * interopRequireWildcard, so we also allow that mode for compatibility. | ||||
|  */ | ||||
| export default class CJSImportProcessor { | ||||
|    __init() {this.nonTypeIdentifiers = new Set()} | ||||
|    __init2() {this.importInfoByPath = new Map()} | ||||
|    __init3() {this.importsToReplace = new Map()} | ||||
|    __init4() {this.identifierReplacements = new Map()} | ||||
|    __init5() {this.exportBindingsByLocalName = new Map()} | ||||
| 
 | ||||
|   constructor( | ||||
|      nameManager, | ||||
|      tokens, | ||||
|      enableLegacyTypeScriptModuleInterop, | ||||
|      options, | ||||
|      isTypeScriptTransformEnabled, | ||||
|      keepUnusedImports, | ||||
|      helperManager, | ||||
|   ) {;this.nameManager = nameManager;this.tokens = tokens;this.enableLegacyTypeScriptModuleInterop = enableLegacyTypeScriptModuleInterop;this.options = options;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.keepUnusedImports = keepUnusedImports;this.helperManager = helperManager;CJSImportProcessor.prototype.__init.call(this);CJSImportProcessor.prototype.__init2.call(this);CJSImportProcessor.prototype.__init3.call(this);CJSImportProcessor.prototype.__init4.call(this);CJSImportProcessor.prototype.__init5.call(this);} | ||||
| 
 | ||||
|   preprocessTokens() { | ||||
|     for (let i = 0; i < this.tokens.tokens.length; i++) { | ||||
|       if ( | ||||
|         this.tokens.matches1AtIndex(i, tt._import) && | ||||
|         !this.tokens.matches3AtIndex(i, tt._import, tt.name, tt.eq) | ||||
|       ) { | ||||
|         this.preprocessImportAtIndex(i); | ||||
|       } | ||||
|       if ( | ||||
|         this.tokens.matches1AtIndex(i, tt._export) && | ||||
|         !this.tokens.matches2AtIndex(i, tt._export, tt.eq) | ||||
|       ) { | ||||
|         this.preprocessExportAtIndex(i); | ||||
|       } | ||||
|     } | ||||
|     this.generateImportReplacements(); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * In TypeScript, import statements that only import types should be removed. | ||||
|    * This includes `import {} from 'foo';`, but not `import 'foo';`. | ||||
|    */ | ||||
|   pruneTypeOnlyImports() { | ||||
|     this.nonTypeIdentifiers = getNonTypeIdentifiers(this.tokens, this.options); | ||||
|     for (const [path, importInfo] of this.importInfoByPath.entries()) { | ||||
|       if ( | ||||
|         importInfo.hasBareImport || | ||||
|         importInfo.hasStarExport || | ||||
|         importInfo.exportStarNames.length > 0 || | ||||
|         importInfo.namedExports.length > 0 | ||||
|       ) { | ||||
|         continue; | ||||
|       } | ||||
|       const names = [ | ||||
|         ...importInfo.defaultNames, | ||||
|         ...importInfo.wildcardNames, | ||||
|         ...importInfo.namedImports.map(({localName}) => localName), | ||||
|       ]; | ||||
|       if (names.every((name) => this.shouldAutomaticallyElideImportedName(name))) { | ||||
|         this.importsToReplace.set(path, ""); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   shouldAutomaticallyElideImportedName(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       !this.nonTypeIdentifiers.has(name) | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|    generateImportReplacements() { | ||||
|     for (const [path, importInfo] of this.importInfoByPath.entries()) { | ||||
|       const { | ||||
|         defaultNames, | ||||
|         wildcardNames, | ||||
|         namedImports, | ||||
|         namedExports, | ||||
|         exportStarNames, | ||||
|         hasStarExport, | ||||
|       } = importInfo; | ||||
| 
 | ||||
|       if ( | ||||
|         defaultNames.length === 0 && | ||||
|         wildcardNames.length === 0 && | ||||
|         namedImports.length === 0 && | ||||
|         namedExports.length === 0 && | ||||
|         exportStarNames.length === 0 && | ||||
|         !hasStarExport | ||||
|       ) { | ||||
|         // Import is never used, so don't even assign a name.
 | ||||
|         this.importsToReplace.set(path, `require('${path}');`); | ||||
|         continue; | ||||
|       } | ||||
| 
 | ||||
|       const primaryImportName = this.getFreeIdentifierForPath(path); | ||||
|       let secondaryImportName; | ||||
|       if (this.enableLegacyTypeScriptModuleInterop) { | ||||
|         secondaryImportName = primaryImportName; | ||||
|       } else { | ||||
|         secondaryImportName = | ||||
|           wildcardNames.length > 0 ? wildcardNames[0] : this.getFreeIdentifierForPath(path); | ||||
|       } | ||||
|       let requireCode = `var ${primaryImportName} = require('${path}');`; | ||||
|       if (wildcardNames.length > 0) { | ||||
|         for (const wildcardName of wildcardNames) { | ||||
|           const moduleExpr = this.enableLegacyTypeScriptModuleInterop | ||||
|             ? primaryImportName | ||||
|             : `${this.helperManager.getHelperName("interopRequireWildcard")}(${primaryImportName})`; | ||||
|           requireCode += ` var ${wildcardName} = ${moduleExpr};`; | ||||
|         } | ||||
|       } else if (exportStarNames.length > 0 && secondaryImportName !== primaryImportName) { | ||||
|         requireCode += ` var ${secondaryImportName} = ${this.helperManager.getHelperName( | ||||
|           "interopRequireWildcard", | ||||
|         )}(${primaryImportName});`;
 | ||||
|       } else if (defaultNames.length > 0 && secondaryImportName !== primaryImportName) { | ||||
|         requireCode += ` var ${secondaryImportName} = ${this.helperManager.getHelperName( | ||||
|           "interopRequireDefault", | ||||
|         )}(${primaryImportName});`;
 | ||||
|       } | ||||
| 
 | ||||
|       for (const {importedName, localName} of namedExports) { | ||||
|         requireCode += ` ${this.helperManager.getHelperName( | ||||
|           "createNamedExportFrom", | ||||
|         )}(${primaryImportName}, '${localName}', '${importedName}');`;
 | ||||
|       } | ||||
|       for (const exportStarName of exportStarNames) { | ||||
|         requireCode += ` exports.${exportStarName} = ${secondaryImportName};`; | ||||
|       } | ||||
|       if (hasStarExport) { | ||||
|         requireCode += ` ${this.helperManager.getHelperName( | ||||
|           "createStarExport", | ||||
|         )}(${primaryImportName});`;
 | ||||
|       } | ||||
| 
 | ||||
|       this.importsToReplace.set(path, requireCode); | ||||
| 
 | ||||
|       for (const defaultName of defaultNames) { | ||||
|         this.identifierReplacements.set(defaultName, `${secondaryImportName}.default`); | ||||
|       } | ||||
|       for (const {importedName, localName} of namedImports) { | ||||
|         this.identifierReplacements.set(localName, `${primaryImportName}.${importedName}`); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
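The assembly above can be illustrated with a small standalone sketch. This is a hypothetical mini version, not the vendored implementation: the helper name `_interopRequireDefault` is hard-coded, whereas the real code asks `HelperManager` for a free name, and only the default-import branch is shown.

```javascript
// Simplified sketch of how generateImportReplacements assembles the
// replacement require code for one import path.
function buildRequireCode(path, primaryName, secondaryName, importInfo) {
  let requireCode = `var ${primaryName} = require('${path}');`;
  // A default import needs the interop helper so that plain CJS modules
  // (no __esModule flag) get wrapped as {default: module}.
  if (importInfo.defaultNames.length > 0 && secondaryName !== primaryName) {
    requireCode += ` var ${secondaryName} = _interopRequireDefault(${primaryName});`;
  }
  return requireCode;
}

// Roughly what `import React from "react"` turns into.
const code = buildRequireCode("react", "_react", "_react2", {defaultNames: ["React"]});
```

Identifier replacements then rewrite usages, e.g. `React` becomes `_react2.default`.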
| 
 | ||||
|   getFreeIdentifierForPath(path) { | ||||
|     const components = path.split("/"); | ||||
|     const lastComponent = components[components.length - 1]; | ||||
|     const baseName = lastComponent.replace(/\W/g, ""); | ||||
|     return this.nameManager.claimFreeName(`_${baseName}`); | ||||
|   } | ||||
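The name derivation in `getFreeIdentifierForPath` can be sketched on its own. This standalone function (`baseIdentifierForPath` is a hypothetical name) covers only the string manipulation; the real method additionally runs the result through `nameManager.claimFreeName` to avoid collisions.

```javascript
// Sketch of getFreeIdentifierForPath: take the last path component,
// strip non-word characters, and prefix with "_".
function baseIdentifierForPath(path) {
  const components = path.split("/");
  const lastComponent = components[components.length - 1];
  return `_${lastComponent.replace(/\W/g, "")}`;
}
```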
| 
 | ||||
|    preprocessImportAtIndex(index) { | ||||
|     const defaultNames = []; | ||||
|     const wildcardNames = []; | ||||
|     const namedImports = []; | ||||
| 
 | ||||
|     index++; | ||||
|     if ( | ||||
|       (this.tokens.matchesContextualAtIndex(index, ContextualKeyword._type) || | ||||
|         this.tokens.matches1AtIndex(index, tt._typeof)) && | ||||
|       !this.tokens.matches1AtIndex(index + 1, tt.comma) && | ||||
|       !this.tokens.matchesContextualAtIndex(index + 1, ContextualKeyword._from) | ||||
|     ) { | ||||
|       // import type declaration, so no need to process anything.
 | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1AtIndex(index, tt.parenL)) { | ||||
|       // Dynamic import, so nothing to do
 | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1AtIndex(index, tt.name)) { | ||||
|       defaultNames.push(this.tokens.identifierNameAtIndex(index)); | ||||
|       index++; | ||||
|       if (this.tokens.matches1AtIndex(index, tt.comma)) { | ||||
|         index++; | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1AtIndex(index, tt.star)) { | ||||
|       // * as
 | ||||
|       index += 2; | ||||
|       wildcardNames.push(this.tokens.identifierNameAtIndex(index)); | ||||
|       index++; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1AtIndex(index, tt.braceL)) { | ||||
|       const result = this.getNamedImports(index + 1); | ||||
|       index = result.newIndex; | ||||
| 
 | ||||
|       for (const namedImport of result.namedImports) { | ||||
|         // Treat {default as X} as a default import to ensure usage of require interop helper
 | ||||
|         if (namedImport.importedName === "default") { | ||||
|           defaultNames.push(namedImport.localName); | ||||
|         } else { | ||||
|           namedImports.push(namedImport); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matchesContextualAtIndex(index, ContextualKeyword._from)) { | ||||
|       index++; | ||||
|     } | ||||
| 
 | ||||
|     if (!this.tokens.matches1AtIndex(index, tt.string)) { | ||||
|       throw new Error("Expected string token at the end of import statement."); | ||||
|     } | ||||
|     const path = this.tokens.stringValueAtIndex(index); | ||||
|     const importInfo = this.getImportInfo(path); | ||||
|     importInfo.defaultNames.push(...defaultNames); | ||||
|     importInfo.wildcardNames.push(...wildcardNames); | ||||
|     importInfo.namedImports.push(...namedImports); | ||||
|     if (defaultNames.length === 0 && wildcardNames.length === 0 && namedImports.length === 0) { | ||||
|       importInfo.hasBareImport = true; | ||||
|     } | ||||
|   } | ||||
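One detail worth highlighting from the loop above: a `{default as X}` specifier is reclassified as a default import so that the require interop helper is applied to it. A minimal sketch of that partitioning step (standalone, with a hypothetical function name):

```javascript
// `import {default as React, useState} from "react"` treats React as a
// default import and keeps useState as a plain named import.
function partitionSpecifiers(specifiers) {
  const defaultNames = [];
  const namedImports = [];
  for (const specifier of specifiers) {
    if (specifier.importedName === "default") {
      defaultNames.push(specifier.localName);
    } else {
      namedImports.push(specifier);
    }
  }
  return {defaultNames, namedImports};
}

const result = partitionSpecifiers([
  {importedName: "default", localName: "React"},
  {importedName: "useState", localName: "useState"},
]);
```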
| 
 | ||||
|    preprocessExportAtIndex(index) { | ||||
|     if ( | ||||
|       this.tokens.matches2AtIndex(index, tt._export, tt._var) || | ||||
|       this.tokens.matches2AtIndex(index, tt._export, tt._let) || | ||||
|       this.tokens.matches2AtIndex(index, tt._export, tt._const) | ||||
|     ) { | ||||
|       this.preprocessVarExportAtIndex(index); | ||||
|     } else if ( | ||||
|       this.tokens.matches2AtIndex(index, tt._export, tt._function) || | ||||
|       this.tokens.matches2AtIndex(index, tt._export, tt._class) | ||||
|     ) { | ||||
|       const exportName = this.tokens.identifierNameAtIndex(index + 2); | ||||
|       this.addExportBinding(exportName, exportName); | ||||
|     } else if (this.tokens.matches3AtIndex(index, tt._export, tt.name, tt._function)) { | ||||
|       const exportName = this.tokens.identifierNameAtIndex(index + 3); | ||||
|       this.addExportBinding(exportName, exportName); | ||||
|     } else if (this.tokens.matches2AtIndex(index, tt._export, tt.braceL)) { | ||||
|       this.preprocessNamedExportAtIndex(index); | ||||
|     } else if (this.tokens.matches2AtIndex(index, tt._export, tt.star)) { | ||||
|       this.preprocessExportStarAtIndex(index); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    preprocessVarExportAtIndex(index) { | ||||
|     let depth = 0; | ||||
|     // Handle cases like `export let {x} = y;`, starting at the open-brace in that case.
 | ||||
|     for (let i = index + 2; ; i++) { | ||||
|       if ( | ||||
|         this.tokens.matches1AtIndex(i, tt.braceL) || | ||||
|         this.tokens.matches1AtIndex(i, tt.dollarBraceL) || | ||||
|         this.tokens.matches1AtIndex(i, tt.bracketL) | ||||
|       ) { | ||||
|         depth++; | ||||
|       } else if ( | ||||
|         this.tokens.matches1AtIndex(i, tt.braceR) || | ||||
|         this.tokens.matches1AtIndex(i, tt.bracketR) | ||||
|       ) { | ||||
|         depth--; | ||||
|       } else if (depth === 0 && !this.tokens.matches1AtIndex(i, tt.name)) { | ||||
|         break; | ||||
|       } else if (this.tokens.matches1AtIndex(i, tt.eq)) { | ||||
|         const endIndex = this.tokens.tokens[i].rhsEndIndex; | ||||
|         if (endIndex == null) { | ||||
|           throw new Error("Expected = token with an end index."); | ||||
|         } | ||||
|         i = endIndex - 1; | ||||
|       } else { | ||||
|         const token = this.tokens.tokens[i]; | ||||
|         if (isDeclaration(token)) { | ||||
|           const exportName = this.tokens.identifierNameAtIndex(i); | ||||
|           this.identifierReplacements.set(exportName, `exports.${exportName}`); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Walk this export statement just in case it's an export...from statement. | ||||
|    * If it is, combine it into the import info for that path. Otherwise, just | ||||
|    * bail out; it'll be handled later. | ||||
|    */ | ||||
|    preprocessNamedExportAtIndex(index) { | ||||
|     // export {
 | ||||
|     index += 2; | ||||
|     const {newIndex, namedImports} = this.getNamedImports(index); | ||||
|     index = newIndex; | ||||
| 
 | ||||
|     if (this.tokens.matchesContextualAtIndex(index, ContextualKeyword._from)) { | ||||
|       index++; | ||||
|     } else { | ||||
|       // Reinterpret "a as b" to be local/exported rather than imported/local.
 | ||||
|       for (const {importedName: localName, localName: exportedName} of namedImports) { | ||||
|         this.addExportBinding(localName, exportedName); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     if (!this.tokens.matches1AtIndex(index, tt.string)) { | ||||
|       throw new Error("Expected string token at the end of export statement."); | ||||
|     } | ||||
|     const path = this.tokens.stringValueAtIndex(index); | ||||
|     const importInfo = this.getImportInfo(path); | ||||
|     importInfo.namedExports.push(...namedImports); | ||||
|   } | ||||
| 
 | ||||
|    preprocessExportStarAtIndex(index) { | ||||
|     let exportedName = null; | ||||
|     if (this.tokens.matches3AtIndex(index, tt._export, tt.star, tt._as)) { | ||||
|       // export * as
 | ||||
|       index += 3; | ||||
|       exportedName = this.tokens.identifierNameAtIndex(index); | ||||
|       // foo from
 | ||||
|       index += 2; | ||||
|     } else { | ||||
|       // export * from
 | ||||
|       index += 3; | ||||
|     } | ||||
|     if (!this.tokens.matches1AtIndex(index, tt.string)) { | ||||
|       throw new Error("Expected string token at the end of star export statement."); | ||||
|     } | ||||
|     const path = this.tokens.stringValueAtIndex(index); | ||||
|     const importInfo = this.getImportInfo(path); | ||||
|     if (exportedName !== null) { | ||||
|       importInfo.exportStarNames.push(exportedName); | ||||
|     } else { | ||||
|       importInfo.hasStarExport = true; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    getNamedImports(index) { | ||||
|     const namedImports = []; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches1AtIndex(index, tt.braceR)) { | ||||
|         index++; | ||||
|         break; | ||||
|       } | ||||
| 
 | ||||
|       const specifierInfo = getImportExportSpecifierInfo(this.tokens, index); | ||||
|       index = specifierInfo.endIndex; | ||||
|       if (!specifierInfo.isType) { | ||||
|         namedImports.push({ | ||||
|           importedName: specifierInfo.leftName, | ||||
|           localName: specifierInfo.rightName, | ||||
|         }); | ||||
|       } | ||||
| 
 | ||||
|       if (this.tokens.matches2AtIndex(index, tt.comma, tt.braceR)) { | ||||
|         index += 2; | ||||
|         break; | ||||
|       } else if (this.tokens.matches1AtIndex(index, tt.braceR)) { | ||||
|         index++; | ||||
|         break; | ||||
|       } else if (this.tokens.matches1AtIndex(index, tt.comma)) { | ||||
|         index++; | ||||
|       } else { | ||||
|         throw new Error(`Unexpected token: ${JSON.stringify(this.tokens.tokens[index])}`); | ||||
|       } | ||||
|     } | ||||
|     return {newIndex: index, namedImports}; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Get a mutable import info object for this path, creating one if it doesn't | ||||
|    * exist yet. | ||||
|    */ | ||||
|    getImportInfo(path) { | ||||
|     const existingInfo = this.importInfoByPath.get(path); | ||||
|     if (existingInfo) { | ||||
|       return existingInfo; | ||||
|     } | ||||
|     const newInfo = { | ||||
|       defaultNames: [], | ||||
|       wildcardNames: [], | ||||
|       namedImports: [], | ||||
|       namedExports: [], | ||||
|       hasBareImport: false, | ||||
|       exportStarNames: [], | ||||
|       hasStarExport: false, | ||||
|     }; | ||||
|     this.importInfoByPath.set(path, newInfo); | ||||
|     return newInfo; | ||||
|   } | ||||
| 
 | ||||
|    addExportBinding(localName, exportedName) { | ||||
|     if (!this.exportBindingsByLocalName.has(localName)) { | ||||
|       this.exportBindingsByLocalName.set(localName, []); | ||||
|     } | ||||
|     this.exportBindingsByLocalName.get(localName).push(exportedName); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return the code to use for the import for this path, or the empty string if | ||||
|    * the code has already been "claimed" by a previous import. | ||||
|    */ | ||||
|   claimImportCode(importPath) { | ||||
|     const result = this.importsToReplace.get(importPath); | ||||
|     this.importsToReplace.set(importPath, ""); | ||||
|     return result || ""; | ||||
|   } | ||||
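The claim-once behavior of `claimImportCode` can be shown in isolation: the first import of a path emits the generated require code, and any later import of the same path emits the empty string. A standalone sketch using a hand-built map:

```javascript
// Sketch of claimImportCode: emitting the require code "claims" it, so
// duplicate imports of the same path produce no further output.
const importsToReplace = new Map([["lodash", "var _lodash = require('lodash');"]]);

function claimImportCode(importPath) {
  const result = importsToReplace.get(importPath);
  importsToReplace.set(importPath, "");
  return result || "";
}
```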
| 
 | ||||
|   getIdentifierReplacement(identifierName) { | ||||
|     return this.identifierReplacements.get(identifierName) || null; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return a string like `exports.foo = exports.bar`. | ||||
|    */ | ||||
|   resolveExportBinding(assignedName) { | ||||
|     const exportedNames = this.exportBindingsByLocalName.get(assignedName); | ||||
|     if (!exportedNames || exportedNames.length === 0) { | ||||
|       return null; | ||||
|     } | ||||
|     return exportedNames.map((exportedName) => `exports.${exportedName}`).join(" = "); | ||||
|   } | ||||
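The same logic works as a free function, which makes the chained-assignment output easy to see. A standalone sketch (the map contents here are invented for illustration):

```javascript
// Sketch of resolveExportBinding: a local name exported under several
// names becomes a chained assignment target like "exports.a = exports.b".
function resolveExportBinding(exportBindingsByLocalName, assignedName) {
  const exportedNames = exportBindingsByLocalName.get(assignedName);
  if (!exportedNames || exportedNames.length === 0) {
    return null;
  }
  return exportedNames.map((exportedName) => `exports.${exportedName}`).join(" = ");
}

// e.g. `export {value, value as defaultValue}` for a local `value`.
const bindings = new Map([["value", ["value", "defaultValue"]]]);
```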
| 
 | ||||
|   /** | ||||
|    * Return all imported/exported names where we might be interested in whether usages of those | ||||
|    * names are shadowed. | ||||
|    */ | ||||
|   getGlobalNames() { | ||||
|     return new Set([ | ||||
|       ...this.identifierReplacements.keys(), | ||||
|       ...this.exportBindingsByLocalName.keys(), | ||||
|     ]); | ||||
|   } | ||||
| } | ||||
							
								
								
									
										176
									
								
								node_modules/sucrase/dist/esm/HelperManager.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,176 @@ | |||
| 
 | ||||
| 
 | ||||
| const HELPERS = { | ||||
|   require: ` | ||||
|     import {createRequire as CREATE_REQUIRE_NAME} from "module"; | ||||
|     const require = CREATE_REQUIRE_NAME(import.meta.url); | ||||
|   `,
 | ||||
|   interopRequireWildcard: ` | ||||
|     function interopRequireWildcard(obj) { | ||||
|       if (obj && obj.__esModule) { | ||||
|         return obj; | ||||
|       } else { | ||||
|         var newObj = {}; | ||||
|         if (obj != null) { | ||||
|           for (var key in obj) { | ||||
|             if (Object.prototype.hasOwnProperty.call(obj, key)) { | ||||
|               newObj[key] = obj[key]; | ||||
|             } | ||||
|           } | ||||
|         } | ||||
|         newObj.default = obj; | ||||
|         return newObj; | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   interopRequireDefault: ` | ||||
|     function interopRequireDefault(obj) { | ||||
|       return obj && obj.__esModule ? obj : { default: obj }; | ||||
|     } | ||||
|   `,
 | ||||
|   createNamedExportFrom: ` | ||||
|     function createNamedExportFrom(obj, localName, importedName) { | ||||
|       Object.defineProperty(exports, localName, {enumerable: true, configurable: true, get: () => obj[importedName]}); | ||||
|     } | ||||
|   `,
 | ||||
|   // Note that TypeScript and Babel do this differently; TypeScript does a simple existence
 | ||||
|   // check in the exports object and does a plain assignment, whereas Babel uses
 | ||||
|   // defineProperty and builds an object of explicitly-exported names so that star exports can
 | ||||
|   // always take lower precedence. For now, we do the easier TypeScript thing.
 | ||||
|   createStarExport: ` | ||||
|     function createStarExport(obj) { | ||||
|       Object.keys(obj) | ||||
|         .filter((key) => key !== "default" && key !== "__esModule") | ||||
|         .forEach((key) => { | ||||
|           if (exports.hasOwnProperty(key)) { | ||||
|             return; | ||||
|           } | ||||
|           Object.defineProperty(exports, key, {enumerable: true, configurable: true, get: () => obj[key]}); | ||||
|         }); | ||||
|     } | ||||
|   `,
 | ||||
|   nullishCoalesce: ` | ||||
|     function nullishCoalesce(lhs, rhsFn) { | ||||
|       if (lhs != null) { | ||||
|         return lhs; | ||||
|       } else { | ||||
|         return rhsFn(); | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   asyncNullishCoalesce: ` | ||||
|     async function asyncNullishCoalesce(lhs, rhsFn) { | ||||
|       if (lhs != null) { | ||||
|         return lhs; | ||||
|       } else { | ||||
|         return await rhsFn(); | ||||
|       } | ||||
|     } | ||||
|   `,
 | ||||
|   optionalChain: ` | ||||
|     function optionalChain(ops) { | ||||
|       let lastAccessLHS = undefined; | ||||
|       let value = ops[0]; | ||||
|       let i = 1; | ||||
|       while (i < ops.length) { | ||||
|         const op = ops[i]; | ||||
|         const fn = ops[i + 1]; | ||||
|         i += 2; | ||||
|         if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { | ||||
|           return undefined; | ||||
|         } | ||||
|         if (op === 'access' || op === 'optionalAccess') { | ||||
|           lastAccessLHS = value; | ||||
|           value = fn(value); | ||||
|         } else if (op === 'call' || op === 'optionalCall') { | ||||
|           value = fn((...args) => value.call(lastAccessLHS, ...args)); | ||||
|           lastAccessLHS = undefined; | ||||
|         } | ||||
|       } | ||||
|       return value; | ||||
|     } | ||||
|   `,
 | ||||
|   asyncOptionalChain: ` | ||||
|     async function asyncOptionalChain(ops) { | ||||
|       let lastAccessLHS = undefined; | ||||
|       let value = ops[0]; | ||||
|       let i = 1; | ||||
|       while (i < ops.length) { | ||||
|         const op = ops[i]; | ||||
|         const fn = ops[i + 1]; | ||||
|         i += 2; | ||||
|         if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { | ||||
|           return undefined; | ||||
|         } | ||||
|         if (op === 'access' || op === 'optionalAccess') { | ||||
|           lastAccessLHS = value; | ||||
|           value = await fn(value); | ||||
|         } else if (op === 'call' || op === 'optionalCall') { | ||||
|           value = await fn((...args) => value.call(lastAccessLHS, ...args)); | ||||
|           lastAccessLHS = undefined; | ||||
|         } | ||||
|       } | ||||
|       return value; | ||||
|     } | ||||
|   `,
 | ||||
|   optionalChainDelete: ` | ||||
|     function optionalChainDelete(ops) { | ||||
|       const result = OPTIONAL_CHAIN_NAME(ops); | ||||
|       return result == null ? true : result; | ||||
|     } | ||||
|   `,
 | ||||
|   asyncOptionalChainDelete: ` | ||||
|     async function asyncOptionalChainDelete(ops) { | ||||
|       const result = await ASYNC_OPTIONAL_CHAIN_NAME(ops); | ||||
|       return result == null ? true : result; | ||||
|     } | ||||
|   `,
 | ||||
| }; | ||||
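The interop helpers above can be exercised directly. This sketch copies `interopRequireDefault` verbatim from the `HELPERS` template: a plain CJS export gets wrapped as `{default: ...}`, while an ES-module namespace (marked with `__esModule`) passes through unchanged.

```javascript
// interopRequireDefault, as defined in HELPERS above.
function interopRequireDefault(obj) {
  return obj && obj.__esModule ? obj : {default: obj};
}

const cjsExport = {helper: () => 42};           // a plain CommonJS module object
const esNamespace = {__esModule: true, default: "x"}; // a transpiled ES module
```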
| 
 | ||||
| export class HelperManager { | ||||
|   __init() {this.helperNames = {}} | ||||
|   __init2() {this.createRequireName = null} | ||||
|   constructor( nameManager) {;this.nameManager = nameManager;HelperManager.prototype.__init.call(this);HelperManager.prototype.__init2.call(this);} | ||||
| 
 | ||||
|   getHelperName(baseName) { | ||||
|     let helperName = this.helperNames[baseName]; | ||||
|     if (helperName) { | ||||
|       return helperName; | ||||
|     } | ||||
|     helperName = this.nameManager.claimFreeName(`_${baseName}`); | ||||
|     this.helperNames[baseName] = helperName; | ||||
|     return helperName; | ||||
|   } | ||||
| 
 | ||||
|   emitHelpers() { | ||||
|     let resultCode = ""; | ||||
|     if (this.helperNames.optionalChainDelete) { | ||||
|       this.getHelperName("optionalChain"); | ||||
|     } | ||||
|     if (this.helperNames.asyncOptionalChainDelete) { | ||||
|       this.getHelperName("asyncOptionalChain"); | ||||
|     } | ||||
|     for (const [baseName, helperCodeTemplate] of Object.entries(HELPERS)) { | ||||
|       const helperName = this.helperNames[baseName]; | ||||
|       let helperCode = helperCodeTemplate; | ||||
|       if (baseName === "optionalChainDelete") { | ||||
|         helperCode = helperCode.replace("OPTIONAL_CHAIN_NAME", this.helperNames.optionalChain); | ||||
|       } else if (baseName === "asyncOptionalChainDelete") { | ||||
|         helperCode = helperCode.replace( | ||||
|           "ASYNC_OPTIONAL_CHAIN_NAME", | ||||
|           this.helperNames.asyncOptionalChain, | ||||
|         ); | ||||
|       } else if (baseName === "require") { | ||||
|         if (this.createRequireName === null) { | ||||
|           this.createRequireName = this.nameManager.claimFreeName("_createRequire"); | ||||
|         } | ||||
|         helperCode = helperCode.replace(/CREATE_REQUIRE_NAME/g, this.createRequireName); | ||||
|       } | ||||
|       if (helperName) { | ||||
|         resultCode += " "; | ||||
|         resultCode += helperCode.replace(baseName, helperName).replace(/\s+/g, " ").trim(); | ||||
|       } | ||||
|     } | ||||
|     return resultCode; | ||||
|   } | ||||
| } | ||||
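The `optionalChain` helper emitted above interprets a flat "ops" array: for `a?.b`, sucrase emits roughly `_optionalChain([a, 'optionalAccess', _ => _.b])`. The sketch below copies the helper from `HELPERS` and applies it to hand-built ops arrays; note how `lastAccessLHS` preserves the `this` binding across an access-then-call pair.

```javascript
// The optionalChain helper from HELPERS, applied to hand-built ops arrays.
function optionalChain(ops) {
  let lastAccessLHS = undefined;
  let value = ops[0];
  let i = 1;
  while (i < ops.length) {
    const op = ops[i];
    const fn = ops[i + 1];
    i += 2;
    // Short-circuit: an optional step on null/undefined yields undefined.
    if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) {
      return undefined;
    }
    if (op === 'access' || op === 'optionalAccess') {
      lastAccessLHS = value;      // remember the receiver for a later call
      value = fn(value);
    } else if (op === 'call' || op === 'optionalCall') {
      value = fn((...args) => value.call(lastAccessLHS, ...args));
      lastAccessLHS = undefined;
    }
  }
  return value;
}

const obj = {n: 3, get() { return this.n; }};
```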
							
								
								
									
										27
									
								
								node_modules/sucrase/dist/esm/NameManager.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,27 @@ | |||
| 
 | ||||
| import getIdentifierNames from "./util/getIdentifierNames"; | ||||
| 
 | ||||
| export default class NameManager { | ||||
|     __init() {this.usedNames = new Set()} | ||||
| 
 | ||||
|   constructor(code, tokens) {;NameManager.prototype.__init.call(this); | ||||
|     this.usedNames = new Set(getIdentifierNames(code, tokens)); | ||||
|   } | ||||
| 
 | ||||
|   claimFreeName(name) { | ||||
|     const newName = this.findFreeName(name); | ||||
|     this.usedNames.add(newName); | ||||
|     return newName; | ||||
|   } | ||||
| 
 | ||||
|   findFreeName(name) { | ||||
|     if (!this.usedNames.has(name)) { | ||||
|       return name; | ||||
|     } | ||||
|     let suffixNum = 2; | ||||
|     while (this.usedNames.has(name + String(suffixNum))) { | ||||
|       suffixNum++; | ||||
|     } | ||||
|     return name + String(suffixNum); | ||||
|   } | ||||
| } | ||||
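The collision handling in `findFreeName` is easy to demonstrate: the first claim of a name gets it verbatim, and later claims get numeric suffixes starting at 2. This sketch (`MiniNameManager` is a hypothetical name) keeps the claim/find logic but seeds `usedNames` from a plain array instead of tokenizing source code.

```javascript
// Sketch of NameManager's name claiming, seeded with pre-used names.
class MiniNameManager {
  constructor(usedNames) {
    this.usedNames = new Set(usedNames);
  }
  claimFreeName(name) {
    const newName = this.findFreeName(name);
    this.usedNames.add(newName);
    return newName;
  }
  findFreeName(name) {
    if (!this.usedNames.has(name)) {
      return name;
    }
    let suffixNum = 2; // suffixes start at 2: name, name2, name3, ...
    while (this.usedNames.has(name + String(suffixNum))) {
      suffixNum++;
    }
    return name + String(suffixNum);
  }
}

const nameManager = new MiniNameManager(["_react"]);
```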
							
								
								
									
										42
									
								
								node_modules/sucrase/dist/esm/Options-gen-types.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,42 @@ | |||
| /** | ||||
|  * This module was automatically generated by `ts-interface-builder` | ||||
|  */ | ||||
| import * as t from "ts-interface-checker"; | ||||
| // tslint:disable:object-literal-key-quotes
 | ||||
| 
 | ||||
| export const Transform = t.union( | ||||
|   t.lit("jsx"), | ||||
|   t.lit("typescript"), | ||||
|   t.lit("flow"), | ||||
|   t.lit("imports"), | ||||
|   t.lit("react-hot-loader"), | ||||
|   t.lit("jest"), | ||||
| ); | ||||
| 
 | ||||
| export const SourceMapOptions = t.iface([], { | ||||
|   compiledFilename: "string", | ||||
| }); | ||||
| 
 | ||||
| export const Options = t.iface([], { | ||||
|   transforms: t.array("Transform"), | ||||
|   disableESTransforms: t.opt("boolean"), | ||||
|   jsxRuntime: t.opt(t.union(t.lit("classic"), t.lit("automatic"), t.lit("preserve"))), | ||||
|   production: t.opt("boolean"), | ||||
|   jsxImportSource: t.opt("string"), | ||||
|   jsxPragma: t.opt("string"), | ||||
|   jsxFragmentPragma: t.opt("string"), | ||||
|   keepUnusedImports: t.opt("boolean"), | ||||
|   preserveDynamicImport: t.opt("boolean"), | ||||
|   injectCreateRequireForImportRequire: t.opt("boolean"), | ||||
|   enableLegacyTypeScriptModuleInterop: t.opt("boolean"), | ||||
|   enableLegacyBabel5ModuleInterop: t.opt("boolean"), | ||||
|   sourceMapOptions: t.opt("SourceMapOptions"), | ||||
|   filePath: t.opt("string"), | ||||
| }); | ||||
| 
 | ||||
| const exportedTypeSuite = { | ||||
|   Transform, | ||||
|   SourceMapOptions, | ||||
|   Options, | ||||
| }; | ||||
| export default exportedTypeSuite; | ||||
							
								
								
									
										101
									
								
								node_modules/sucrase/dist/esm/Options.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,101 @@ | |||
| import {createCheckers} from "ts-interface-checker"; | ||||
| 
 | ||||
| import OptionsGenTypes from "./Options-gen-types"; | ||||
| 
 | ||||
| const {Options: OptionsChecker} = createCheckers(OptionsGenTypes); | ||||
| 
 | ||||
| 
 | ||||
| export function validateOptions(options) { | ||||
|   OptionsChecker.strictCheck(options); | ||||
| } | ||||
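`OptionsChecker.strictCheck` validates an options object against the generated `Options` interface: required fields must be present and well-typed, and unknown fields are rejected. The sketch below only approximates that behavior without the ts-interface-checker dependency; the key list is abridged from the interface above and the function name is hypothetical.

```javascript
// Rough approximation of strictCheck on the Options interface:
// `transforms` is required, and unrecognized keys are errors.
const KNOWN_OPTION_KEYS = new Set([
  "transforms", "disableESTransforms", "jsxRuntime", "production",
  "keepUnusedImports", "preserveDynamicImport", "filePath",
]);

function validateOptionsSketch(options) {
  if (!Array.isArray(options.transforms)) {
    throw new Error("Options.transforms must be an array.");
  }
  for (const key of Object.keys(options)) {
    if (!KNOWN_OPTION_KEYS.has(key)) {
      throw new Error(`Unknown option: ${key}`);
    }
  }
}
```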
							
								
								
									
										357
									
								
								node_modules/sucrase/dist/esm/TokenProcessor.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,357 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| import { TokenType as tt} from "./parser/tokenizer/types"; | ||||
| import isAsyncOperation from "./util/isAsyncOperation"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| export default class TokenProcessor { | ||||
|    __init() {this.resultCode = ""} | ||||
|   // Array mapping input token index to optional string index position in the
 | ||||
|   // output code.
 | ||||
|    __init2() {this.resultMappings = new Array(this.tokens.length)} | ||||
|    __init3() {this.tokenIndex = 0} | ||||
| 
 | ||||
|   constructor( | ||||
|      code, | ||||
|      tokens, | ||||
|      isFlowEnabled, | ||||
|      disableESTransforms, | ||||
|      helperManager, | ||||
|   ) {;this.code = code;this.tokens = tokens;this.isFlowEnabled = isFlowEnabled;this.disableESTransforms = disableESTransforms;this.helperManager = helperManager;TokenProcessor.prototype.__init.call(this);TokenProcessor.prototype.__init2.call(this);TokenProcessor.prototype.__init3.call(this);} | ||||
| 
 | ||||
|   /** | ||||
|    * Snapshot the token state in a way that can be restored later, useful for | ||||
|    * things like lookahead. | ||||
|    * | ||||
|    * resultMappings do not need to be copied since in all use cases, they will | ||||
|    * be overwritten anyway after restore. | ||||
|    */ | ||||
|   snapshot() { | ||||
|     return { | ||||
|       resultCode: this.resultCode, | ||||
|       tokenIndex: this.tokenIndex, | ||||
|     }; | ||||
|   } | ||||
| 
 | ||||
|   restoreToSnapshot(snapshot) { | ||||
|     this.resultCode = snapshot.resultCode; | ||||
|     this.tokenIndex = snapshot.tokenIndex; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove and return the code generated since the snapshot, leaving the | ||||
|    * current token position in-place. Unlike most TokenProcessor operations, | ||||
|    * this operation can result in input/output line number mismatches because | ||||
|    * the removed code may contain newlines, so this operation should be used | ||||
|    * sparingly. | ||||
|    */ | ||||
|   dangerouslyGetAndRemoveCodeSinceSnapshot(snapshot) { | ||||
|     const result = this.resultCode.slice(snapshot.resultCode.length); | ||||
|     this.resultCode = snapshot.resultCode; | ||||
|     return result; | ||||
|   } | ||||
| 
 | ||||
|   reset() { | ||||
|     this.resultCode = ""; | ||||
|     this.resultMappings = new Array(this.tokens.length); | ||||
|     this.tokenIndex = 0; | ||||
|   } | ||||
| 
 | ||||
|   matchesContextualAtIndex(index, contextualKeyword) { | ||||
|     return ( | ||||
|       this.matches1AtIndex(index, tt.name) && | ||||
|       this.tokens[index].contextualKeyword === contextualKeyword | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameAtIndex(index) { | ||||
|     // TODO: We need to process escapes since technically you can have unicode escapes in variable
 | ||||
|     // names.
 | ||||
|     return this.identifierNameForToken(this.tokens[index]); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameAtRelativeIndex(relativeIndex) { | ||||
|     return this.identifierNameForToken(this.tokenAtRelativeIndex(relativeIndex)); | ||||
|   } | ||||
| 
 | ||||
|   identifierName() { | ||||
|     return this.identifierNameForToken(this.currentToken()); | ||||
|   } | ||||
| 
 | ||||
|   identifierNameForToken(token) { | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   rawCodeForToken(token) { | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   stringValueAtIndex(index) { | ||||
|     return this.stringValueForToken(this.tokens[index]); | ||||
|   } | ||||
| 
 | ||||
|   stringValue() { | ||||
|     return this.stringValueForToken(this.currentToken()); | ||||
|   } | ||||
| 
 | ||||
|   stringValueForToken(token) { | ||||
|     // This is used to identify when two imports are the same and to resolve TypeScript enum keys.
 | ||||
|     // Ideally we'd process escapes within the strings, but for now we pretty much take the raw
 | ||||
|     // code.
 | ||||
|     return this.code.slice(token.start + 1, token.end - 1); | ||||
|   } | ||||
| 
 | ||||
|   matches1AtIndex(index, t1) { | ||||
|     return this.tokens[index].type === t1; | ||||
|   } | ||||
| 
 | ||||
|   matches2AtIndex(index, t1, t2) { | ||||
|     return this.tokens[index].type === t1 && this.tokens[index + 1].type === t2; | ||||
|   } | ||||
| 
 | ||||
|   matches3AtIndex(index, t1, t2, t3) { | ||||
|     return ( | ||||
|       this.tokens[index].type === t1 && | ||||
|       this.tokens[index + 1].type === t2 && | ||||
|       this.tokens[index + 2].type === t3 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches1(t1) { | ||||
|     return this.tokens[this.tokenIndex].type === t1; | ||||
|   } | ||||
| 
 | ||||
|   matches2(t1, t2) { | ||||
|     return this.tokens[this.tokenIndex].type === t1 && this.tokens[this.tokenIndex + 1].type === t2; | ||||
|   } | ||||
| 
 | ||||
|   matches3(t1, t2, t3) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches4(t1, t2, t3, t4) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 && | ||||
|       this.tokens[this.tokenIndex + 3].type === t4 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matches5(t1, t2, t3, t4, t5) { | ||||
|     return ( | ||||
|       this.tokens[this.tokenIndex].type === t1 && | ||||
|       this.tokens[this.tokenIndex + 1].type === t2 && | ||||
|       this.tokens[this.tokenIndex + 2].type === t3 && | ||||
|       this.tokens[this.tokenIndex + 3].type === t4 && | ||||
|       this.tokens[this.tokenIndex + 4].type === t5 | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   matchesContextual(contextualKeyword) { | ||||
|     return this.matchesContextualAtIndex(this.tokenIndex, contextualKeyword); | ||||
|   } | ||||
| 
 | ||||
|   matchesContextIdAndLabel(type, contextId) { | ||||
|     return this.matches1(type) && this.currentToken().contextId === contextId; | ||||
|   } | ||||
| 
 | ||||
|   previousWhitespaceAndComments() { | ||||
|     let whitespaceAndComments = this.code.slice( | ||||
|       this.tokenIndex > 0 ? this.tokens[this.tokenIndex - 1].end : 0, | ||||
|       this.tokenIndex < this.tokens.length ? this.tokens[this.tokenIndex].start : this.code.length, | ||||
|     ); | ||||
|     if (this.isFlowEnabled) { | ||||
|       whitespaceAndComments = whitespaceAndComments.replace(/@flow/g, ""); | ||||
|     } | ||||
|     return whitespaceAndComments; | ||||
|   } | ||||
| 
 | ||||
|   replaceToken(newCode) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += newCode; | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   replaceTokenTrimmingLeftWhitespace(newCode) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments().replace(/[^\r\n]/g, ""); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += newCode; | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   removeInitialToken() { | ||||
|     this.replaceToken(""); | ||||
|   } | ||||
| 
 | ||||
|   removeToken() { | ||||
|     this.replaceTokenTrimmingLeftWhitespace(""); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove all code until the next }, accounting for balanced braces. | ||||
|    */ | ||||
|   removeBalancedCode() { | ||||
|     let braceDepth = 0; | ||||
|     while (!this.isAtEnd()) { | ||||
|       if (this.matches1(tt.braceL)) { | ||||
|         braceDepth++; | ||||
|       } else if (this.matches1(tt.braceR)) { | ||||
|         if (braceDepth === 0) { | ||||
|           return; | ||||
|         } | ||||
|         braceDepth--; | ||||
|       } | ||||
|       this.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   copyExpectedToken(tokenType) { | ||||
|     if (this.tokens[this.tokenIndex].type !== tokenType) { | ||||
|       throw new Error(`Expected token ${tokenType}`); | ||||
|     } | ||||
|     this.copyToken(); | ||||
|   } | ||||
| 
 | ||||
|   copyToken() { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += this.code.slice( | ||||
|       this.tokens[this.tokenIndex].start, | ||||
|       this.tokens[this.tokenIndex].end, | ||||
|     ); | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   copyTokenWithPrefix(prefix) { | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     this.appendTokenPrefix(); | ||||
|     this.resultCode += prefix; | ||||
|     this.resultMappings[this.tokenIndex] = this.resultCode.length; | ||||
|     this.resultCode += this.code.slice( | ||||
|       this.tokens[this.tokenIndex].start, | ||||
|       this.tokens[this.tokenIndex].end, | ||||
|     ); | ||||
|     this.appendTokenSuffix(); | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   appendTokenPrefix() { | ||||
|     const token = this.currentToken(); | ||||
|     if (token.numNullishCoalesceStarts || token.isOptionalChainStart) { | ||||
|       token.isAsyncOperation = isAsyncOperation(this); | ||||
|     } | ||||
|     if (this.disableESTransforms) { | ||||
|       return; | ||||
|     } | ||||
|     if (token.numNullishCoalesceStarts) { | ||||
|       for (let i = 0; i < token.numNullishCoalesceStarts; i++) { | ||||
|         if (token.isAsyncOperation) { | ||||
|           this.resultCode += "await "; | ||||
|           this.resultCode += this.helperManager.getHelperName("asyncNullishCoalesce"); | ||||
|         } else { | ||||
|           this.resultCode += this.helperManager.getHelperName("nullishCoalesce"); | ||||
|         } | ||||
|         this.resultCode += "("; | ||||
|       } | ||||
|     } | ||||
|     if (token.isOptionalChainStart) { | ||||
|       if (token.isAsyncOperation) { | ||||
|         this.resultCode += "await "; | ||||
|       } | ||||
|       if (this.tokenIndex > 0 && this.tokenAtRelativeIndex(-1).type === tt._delete) { | ||||
|         if (token.isAsyncOperation) { | ||||
|           this.resultCode += this.helperManager.getHelperName("asyncOptionalChainDelete"); | ||||
|         } else { | ||||
|           this.resultCode += this.helperManager.getHelperName("optionalChainDelete"); | ||||
|         } | ||||
|       } else if (token.isAsyncOperation) { | ||||
|         this.resultCode += this.helperManager.getHelperName("asyncOptionalChain"); | ||||
|       } else { | ||||
|         this.resultCode += this.helperManager.getHelperName("optionalChain"); | ||||
|       } | ||||
|       this.resultCode += "(["; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   appendTokenSuffix() { | ||||
|     const token = this.currentToken(); | ||||
|     if (token.isOptionalChainEnd && !this.disableESTransforms) { | ||||
|       this.resultCode += "])"; | ||||
|     } | ||||
|     if (token.numNullishCoalesceEnds && !this.disableESTransforms) { | ||||
|       for (let i = 0; i < token.numNullishCoalesceEnds; i++) { | ||||
|         this.resultCode += "))"; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   appendCode(code) { | ||||
|     this.resultCode += code; | ||||
|   } | ||||
| 
 | ||||
|   currentToken() { | ||||
|     return this.tokens[this.tokenIndex]; | ||||
|   } | ||||
| 
 | ||||
|   currentTokenCode() { | ||||
|     const token = this.currentToken(); | ||||
|     return this.code.slice(token.start, token.end); | ||||
|   } | ||||
| 
 | ||||
|   tokenAtRelativeIndex(relativeIndex) { | ||||
|     return this.tokens[this.tokenIndex + relativeIndex]; | ||||
|   } | ||||
| 
 | ||||
|   currentIndex() { | ||||
|     return this.tokenIndex; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Move to the next token. Only suitable in preprocessing steps. When | ||||
|    * generating new code, you should use copyToken or removeToken. | ||||
|    */ | ||||
|   nextToken() { | ||||
|     if (this.tokenIndex === this.tokens.length) { | ||||
|       throw new Error("Unexpectedly reached end of input."); | ||||
|     } | ||||
|     this.tokenIndex++; | ||||
|   } | ||||
| 
 | ||||
|   previousToken() { | ||||
|     this.tokenIndex--; | ||||
|   } | ||||
| 
 | ||||
|   finish() { | ||||
|     if (this.tokenIndex !== this.tokens.length) { | ||||
|       throw new Error("Tried to finish processing tokens before reaching the end."); | ||||
|     } | ||||
|     this.resultCode += this.previousWhitespaceAndComments(); | ||||
|     return {code: this.resultCode, mappings: this.resultMappings}; | ||||
|   } | ||||
| 
 | ||||
|   isAtEnd() { | ||||
|     return this.tokenIndex === this.tokens.length; | ||||
|   } | ||||
| } | ||||
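The balanced-brace scan in `removeBalancedCode` above can be sketched standalone. This is a simplified model, not sucrase's real Token class: tokens here are plain `{type}` objects with string type names.

```javascript
// Count how many tokens removeBalancedCode would consume before stopping at
// the brace that closes the current block, tracking nested brace depth.
function tokensUntilBalancedClose(tokens, startIndex) {
  let braceDepth = 0;
  let i = startIndex;
  while (i < tokens.length) {
    if (tokens[i].type === "braceL") {
      braceDepth++;
    } else if (tokens[i].type === "braceR") {
      if (braceDepth === 0) {
        // This is the `}` that closes the enclosing block; stop here.
        return i - startIndex;
      }
      braceDepth--;
    }
    i++;
  }
  return i - startIndex; // no closing brace found before end of input
}

// For `a { b } c }`, the final braceR closes the outer block,
// so 5 tokens are consumed before it.
const toks = ["name", "braceL", "name", "braceR", "name", "braceR"].map((type) => ({type}));
```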
320  node_modules/sucrase/dist/esm/cli.js  generated  vendored  Normal file
							|  | @ -0,0 +1,320 @@ | |||
| /* eslint-disable no-console */ | ||||
| import commander from "commander"; | ||||
| import globCb from "glob"; | ||||
| import {exists, mkdir, readdir, readFile, stat, writeFile} from "mz/fs"; | ||||
| import {dirname, join, relative} from "path"; | ||||
| import {promisify} from "util"; | ||||
| 
 | ||||
| import {transform} from "./index"; | ||||
| 
 | ||||
| const glob = promisify(globCb); | ||||
| 
 | ||||
| export default function run() { | ||||
|   commander | ||||
|     .description(`Sucrase: super-fast Babel alternative.`) | ||||
|     .usage("[options] <srcDir>") | ||||
|     .option( | ||||
|       "-d, --out-dir <out>", | ||||
|       "Compile an input directory of modules into an output directory.", | ||||
|     ) | ||||
|     .option( | ||||
|       "-p, --project <dir>", | ||||
|       "Compile a TypeScript project, will read from tsconfig.json in <dir>", | ||||
|     ) | ||||
|     .option("--out-extension <extension>", "File extension to use for all output files.", "js") | ||||
|     .option("--exclude-dirs <paths>", "Names of directories that should not be traversed.") | ||||
|     .option("-q, --quiet", "Don't print the names of converted files.") | ||||
|     .option("-t, --transforms <transforms>", "Comma-separated list of transforms to run.") | ||||
|     .option("--disable-es-transforms", "Opt out of all ES syntax transforms.") | ||||
|     .option("--jsx-runtime <string>", "Transformation mode for the JSX transform.") | ||||
|     .option("--production", "Disable debugging information from JSX in output.") | ||||
|     .option( | ||||
|       "--jsx-import-source <string>", | ||||
|       "Automatic JSX transform import path prefix, defaults to `react`.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--jsx-pragma <string>", | ||||
|       "Classic JSX transform element creation function, defaults to `React.createElement`.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--jsx-fragment-pragma <string>", | ||||
|       "Classic JSX transform fragment component, defaults to `React.Fragment`.", | ||||
|     ) | ||||
|     .option("--keep-unused-imports", "Disable automatic removal of type-only imports/exports.") | ||||
|     .option("--preserve-dynamic-import", "Don't transpile dynamic import() to require.") | ||||
|     .option( | ||||
|       "--inject-create-require-for-import-require", | ||||
|       "Use `createRequire` when transpiling TS `import = require` to ESM.", | ||||
|     ) | ||||
|     .option( | ||||
|       "--enable-legacy-typescript-module-interop", | ||||
|       "Use default TypeScript ESM/CJS interop strategy.", | ||||
|     ) | ||||
|     .option("--enable-legacy-babel5-module-interop", "Use Babel 5 ESM/CJS interop strategy.") | ||||
|     .parse(process.argv); | ||||
| 
 | ||||
|   if (commander.project) { | ||||
|     if ( | ||||
|       commander.outDir || | ||||
|       commander.transforms || | ||||
|       commander.args[0] || | ||||
|       commander.enableLegacyTypescriptModuleInterop | ||||
|     ) { | ||||
|       console.error( | ||||
|         "If TypeScript project is specified, out directory, transforms, source " + | ||||
|           "directory, and --enable-legacy-typescript-module-interop may not be specified.", | ||||
|       ); | ||||
|       process.exit(1); | ||||
|     } | ||||
|   } else { | ||||
|     if (!commander.outDir) { | ||||
|       console.error("Out directory is required."); | ||||
|       process.exit(1); | ||||
|     } | ||||
| 
 | ||||
|     if (!commander.transforms) { | ||||
|       console.error("Transforms option is required."); | ||||
|       process.exit(1); | ||||
|     } | ||||
| 
 | ||||
|     if (!commander.args[0]) { | ||||
|       console.error("Source directory is required."); | ||||
|       process.exit(1); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   const options = { | ||||
|     outDirPath: commander.outDir, | ||||
|     srcDirPath: commander.args[0], | ||||
|     project: commander.project, | ||||
|     outExtension: commander.outExtension, | ||||
|     excludeDirs: commander.excludeDirs ? commander.excludeDirs.split(",") : [], | ||||
|     quiet: commander.quiet, | ||||
|     sucraseOptions: { | ||||
|       transforms: commander.transforms ? commander.transforms.split(",") : [], | ||||
|       disableESTransforms: commander.disableEsTransforms, | ||||
|       jsxRuntime: commander.jsxRuntime, | ||||
|       production: commander.production, | ||||
|       jsxImportSource: commander.jsxImportSource, | ||||
|       jsxPragma: commander.jsxPragma || "React.createElement", | ||||
|       jsxFragmentPragma: commander.jsxFragmentPragma || "React.Fragment", | ||||
|       keepUnusedImports: commander.keepUnusedImports, | ||||
|       preserveDynamicImport: commander.preserveDynamicImport, | ||||
|       injectCreateRequireForImportRequire: commander.injectCreateRequireForImportRequire, | ||||
|       enableLegacyTypeScriptModuleInterop: commander.enableLegacyTypescriptModuleInterop, | ||||
|       enableLegacyBabel5ModuleInterop: commander.enableLegacyBabel5ModuleInterop, | ||||
|     }, | ||||
|   }; | ||||
| 
 | ||||
|   buildDirectory(options).catch((e) => { | ||||
|     process.exitCode = 1; | ||||
|     console.error(e); | ||||
|   }); | ||||
| } | ||||
| 
 | ||||
| async function findFiles(options) { | ||||
|   const outDirPath = options.outDirPath; | ||||
|   const srcDirPath = options.srcDirPath; | ||||
| 
 | ||||
|   const extensions = options.sucraseOptions.transforms.includes("typescript") | ||||
|     ? [".ts", ".tsx"] | ||||
|     : [".js", ".jsx"]; | ||||
| 
 | ||||
|   if (!(await exists(outDirPath))) { | ||||
|     await mkdir(outDirPath); | ||||
|   } | ||||
| 
 | ||||
|   const outArr = []; | ||||
|   for (const child of await readdir(srcDirPath)) { | ||||
|     if (["node_modules", ".git"].includes(child) || options.excludeDirs.includes(child)) { | ||||
|       continue; | ||||
|     } | ||||
|     const srcChildPath = join(srcDirPath, child); | ||||
|     const outChildPath = join(outDirPath, child); | ||||
|     if ((await stat(srcChildPath)).isDirectory()) { | ||||
|       const innerOptions = {...options}; | ||||
|       innerOptions.srcDirPath = srcChildPath; | ||||
|       innerOptions.outDirPath = outChildPath; | ||||
|       const innerFiles = await findFiles(innerOptions); | ||||
|       outArr.push(...innerFiles); | ||||
|     } else if (extensions.some((ext) => srcChildPath.endsWith(ext))) { | ||||
|       const outPath = outChildPath.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
|       outArr.push({ | ||||
|         srcPath: srcChildPath, | ||||
|         outPath, | ||||
|       }); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   return outArr; | ||||
| } | ||||
| 
 | ||||
| async function runGlob(options) { | ||||
|   const tsConfigPath = join(options.project, "tsconfig.json"); | ||||
| 
 | ||||
|   let str; | ||||
|   try { | ||||
|     str = await readFile(tsConfigPath, "utf8"); | ||||
|   } catch (err) { | ||||
|     console.error("Could not find project tsconfig.json"); | ||||
|     console.error(`  --project=${options.project}`); | ||||
|     console.error(err); | ||||
|     process.exit(1); | ||||
|   } | ||||
|   const json = JSON.parse(str); | ||||
| 
 | ||||
|   const foundFiles = []; | ||||
| 
 | ||||
|   const files = json.files; | ||||
|   const include = json.include; | ||||
| 
 | ||||
|   const absProject = join(process.cwd(), options.project); | ||||
|   const outDirs = []; | ||||
| 
 | ||||
|   if (!(await exists(options.outDirPath))) { | ||||
|     await mkdir(options.outDirPath); | ||||
|   } | ||||
| 
 | ||||
|   if (files) { | ||||
|     for (const file of files) { | ||||
|       if (file.endsWith(".d.ts")) { | ||||
|         continue; | ||||
|       } | ||||
|       if (!file.endsWith(".ts") && !file.endsWith(".js")) { | ||||
|         continue; | ||||
|       } | ||||
| 
 | ||||
|       const srcFile = join(absProject, file); | ||||
|       const outFile = join(options.outDirPath, file); | ||||
|       const outPath = outFile.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
| 
 | ||||
|       const outDir = dirname(outPath); | ||||
|       if (!outDirs.includes(outDir)) { | ||||
|         outDirs.push(outDir); | ||||
|       } | ||||
| 
 | ||||
|       foundFiles.push({ | ||||
|         srcPath: srcFile, | ||||
|         outPath, | ||||
|       }); | ||||
|     } | ||||
|   } | ||||
|   if (include) { | ||||
|     for (const pattern of include) { | ||||
|       const globFiles = await glob(join(absProject, pattern)); | ||||
|       for (const file of globFiles) { | ||||
|         if (!file.endsWith(".ts") && !file.endsWith(".js")) { | ||||
|           continue; | ||||
|         } | ||||
|         if (file.endsWith(".d.ts")) { | ||||
|           continue; | ||||
|         } | ||||
| 
 | ||||
|         const relativeFile = relative(absProject, file); | ||||
|         const outFile = join(options.outDirPath, relativeFile); | ||||
|         const outPath = outFile.replace(/\.\w+$/, `.${options.outExtension}`); | ||||
| 
 | ||||
|         const outDir = dirname(outPath); | ||||
|         if (!outDirs.includes(outDir)) { | ||||
|           outDirs.push(outDir); | ||||
|         } | ||||
| 
 | ||||
|         foundFiles.push({ | ||||
|           srcPath: file, | ||||
|           outPath, | ||||
|         }); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   for (const outDirPath of outDirs) { | ||||
|     if (!(await exists(outDirPath))) { | ||||
|       await mkdir(outDirPath); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   // TODO: read exclude
 | ||||
| 
 | ||||
|   return foundFiles; | ||||
| } | ||||
| 
 | ||||
| async function updateOptionsFromProject(options) { | ||||
|   /** | ||||
|    * Read the project information and assign the following. | ||||
|    *  - outDirPath | ||||
|    *  - transform: imports | ||||
|    *  - transform: typescript | ||||
|    *  - enableLegacyTypescriptModuleInterop: true/false. | ||||
|    */ | ||||
| 
 | ||||
|   const tsConfigPath = join(options.project, "tsconfig.json"); | ||||
| 
 | ||||
|   let str; | ||||
|   try { | ||||
|     str = await readFile(tsConfigPath, "utf8"); | ||||
|   } catch (err) { | ||||
|     console.error("Could not find project tsconfig.json"); | ||||
|     console.error(`  --project=${options.project}`); | ||||
|     console.error(err); | ||||
|     process.exit(1); | ||||
|   } | ||||
|   const json = JSON.parse(str); | ||||
|   const sucraseOpts = options.sucraseOptions; | ||||
|   if (!sucraseOpts.transforms.includes("typescript")) { | ||||
|     sucraseOpts.transforms.push("typescript"); | ||||
|   } | ||||
| 
 | ||||
|   const compilerOpts = json.compilerOptions; | ||||
|   if (compilerOpts.outDir) { | ||||
|     options.outDirPath = join(process.cwd(), options.project, compilerOpts.outDir); | ||||
|   } | ||||
|   if (compilerOpts.esModuleInterop !== true) { | ||||
|     sucraseOpts.enableLegacyTypeScriptModuleInterop = true; | ||||
|   } | ||||
|   if (compilerOpts.module === "commonjs") { | ||||
|     if (!sucraseOpts.transforms.includes("imports")) { | ||||
|       sucraseOpts.transforms.push("imports"); | ||||
|     } | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| async function buildDirectory(options) { | ||||
|   let files; | ||||
|   if (options.outDirPath && options.srcDirPath) { | ||||
|     files = await findFiles(options); | ||||
|   } else if (options.project) { | ||||
|     await updateOptionsFromProject(options); | ||||
|     files = await runGlob(options); | ||||
|   } else { | ||||
|     console.error("Project or Source directory required."); | ||||
|     process.exit(1); | ||||
|   } | ||||
| 
 | ||||
|   for (const file of files) { | ||||
|     await buildFile(file.srcPath, file.outPath, options); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| async function buildFile(srcPath, outPath, options) { | ||||
|   if (!options.quiet) { | ||||
|     console.log(`${srcPath} -> ${outPath}`); | ||||
|   } | ||||
|   const code = (await readFile(srcPath)).toString(); | ||||
|   const transformedCode = transform(code, {...options.sucraseOptions, filePath: srcPath}).code; | ||||
|   await writeFile(outPath, transformedCode); | ||||
| } | ||||
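Both `findFiles` and `runGlob` above derive each output path with a single regex replace on the last extension segment. The same rewrite in isolation (the helper name `toOutPath` is illustrative, not part of sucrase):

```javascript
// Rewrite the final ".<ext>" segment to the configured output extension,
// mirroring the `.replace(/\.\w+$/, ...)` calls in the CLI above.
function toOutPath(outChildPath, outExtension) {
  return outChildPath.replace(/\.\w+$/, `.${outExtension}`);
}
```

Note that only the trailing extension matches, so dots earlier in the path are left untouched.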
89  node_modules/sucrase/dist/esm/computeSourceMap.js  generated  vendored  Normal file
							|  | @ -0,0 +1,89 @@ | |||
| import {GenMapping, maybeAddSegment, toEncodedMap} from "@jridgewell/gen-mapping"; | ||||
| 
 | ||||
| import {charCodes} from "./parser/util/charcodes"; | ||||
| 
 | ||||
| /** | ||||
|  * Generate a source map indicating that each line maps directly to the original line, | ||||
|  * with the tokens in their new positions. | ||||
|  */ | ||||
| export default function computeSourceMap( | ||||
|   {code: generatedCode, mappings: rawMappings}, | ||||
|   filePath, | ||||
|   options, | ||||
|   source, | ||||
|   tokens, | ||||
| ) { | ||||
|   const sourceColumns = computeSourceColumns(source, tokens); | ||||
|   const map = new GenMapping({file: options.compiledFilename}); | ||||
|   let tokenIndex = 0; | ||||
|   // currentMapping is the output source index for the current input token being
 | ||||
|   // considered.
 | ||||
|   let currentMapping = rawMappings[0]; | ||||
|   while (currentMapping === undefined && tokenIndex < rawMappings.length - 1) { | ||||
|     tokenIndex++; | ||||
|     currentMapping = rawMappings[tokenIndex]; | ||||
|   } | ||||
|   let line = 0; | ||||
|   let lineStart = 0; | ||||
|   if (currentMapping !== lineStart) { | ||||
|     maybeAddSegment(map, line, 0, filePath, line, 0); | ||||
|   } | ||||
|   for (let i = 0; i < generatedCode.length; i++) { | ||||
|     if (i === currentMapping) { | ||||
|       const genColumn = currentMapping - lineStart; | ||||
|       const sourceColumn = sourceColumns[tokenIndex]; | ||||
|       maybeAddSegment(map, line, genColumn, filePath, line, sourceColumn); | ||||
|       while ( | ||||
|         (currentMapping === i || currentMapping === undefined) && | ||||
|         tokenIndex < rawMappings.length - 1 | ||||
|       ) { | ||||
|         tokenIndex++; | ||||
|         currentMapping = rawMappings[tokenIndex]; | ||||
|       } | ||||
|     } | ||||
|     if (generatedCode.charCodeAt(i) === charCodes.lineFeed) { | ||||
|       line++; | ||||
|       lineStart = i + 1; | ||||
|       if (currentMapping !== lineStart) { | ||||
|         maybeAddSegment(map, line, 0, filePath, line, 0); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   const {sourceRoot, sourcesContent, ...sourceMap} = toEncodedMap(map); | ||||
|   return sourceMap; | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Create an array mapping each token index to the 0-based column of the start | ||||
|  * position of the token. | ||||
|  */ | ||||
| function computeSourceColumns(code, tokens) { | ||||
|   const sourceColumns = new Array(tokens.length); | ||||
|   let tokenIndex = 0; | ||||
|   let currentMapping = tokens[tokenIndex].start; | ||||
|   let lineStart = 0; | ||||
|   for (let i = 0; i < code.length; i++) { | ||||
|     if (i === currentMapping) { | ||||
|       sourceColumns[tokenIndex] = currentMapping - lineStart; | ||||
|       tokenIndex++; | ||||
|       currentMapping = tokens[tokenIndex].start; | ||||
|     } | ||||
|     if (code.charCodeAt(i) === charCodes.lineFeed) { | ||||
|       lineStart = i + 1; | ||||
|     } | ||||
|   } | ||||
|   return sourceColumns; | ||||
| } | ||||
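`computeSourceColumns` above converts absolute token start offsets into per-line 0-based columns by tracking the most recent line start during a single pass over the code. The same idea in a self-contained form (simplified: a sorted offset array instead of sucrase tokens, and no EOF token):

```javascript
// For each offset into `code`, compute its 0-based column by remembering
// where the current line began, scanning the code exactly once.
function columnsForOffsets(code, offsets) {
  const columns = [];
  let lineStart = 0;
  let next = 0; // index into offsets (assumed sorted ascending)
  for (let i = 0; i <= code.length; i++) {
    while (next < offsets.length && offsets[next] === i) {
      columns.push(i - lineStart);
      next++;
    }
    if (code.charCodeAt(i) === 10) { // "\n", as charCodes.lineFeed above
      lineStart = i + 1;
    }
  }
  return columns;
}
```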
98  node_modules/sucrase/dist/esm/identifyShadowedGlobals.js  generated  vendored  Normal file
							|  | @ -0,0 +1,98 @@ | |||
| import { | ||||
|   isBlockScopedDeclaration, | ||||
|   isFunctionScopedDeclaration, | ||||
|   isNonTopLevelDeclaration, | ||||
| } from "./parser/tokenizer"; | ||||
| 
 | ||||
| import {TokenType as tt} from "./parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Traverse the given tokens and modify them if necessary to indicate that some names shadow global | ||||
|  * variables. | ||||
|  */ | ||||
| export default function identifyShadowedGlobals( | ||||
|   tokens, | ||||
|   scopes, | ||||
|   globalNames, | ||||
| ) { | ||||
|   if (!hasShadowedGlobals(tokens, globalNames)) { | ||||
|     return; | ||||
|   } | ||||
|   markShadowedGlobals(tokens, scopes, globalNames); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * We can do a fast up-front check to see if there are any declarations to global names. If not, | ||||
|  * then there's no point in computing scope assignments. | ||||
|  */ | ||||
| // Exported for testing.
 | ||||
| export function hasShadowedGlobals(tokens, globalNames) { | ||||
|   for (const token of tokens.tokens) { | ||||
|     if ( | ||||
|       token.type === tt.name && | ||||
|       !token.isType && | ||||
|       isNonTopLevelDeclaration(token) && | ||||
|       globalNames.has(tokens.identifierNameForToken(token)) | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } | ||||
| 
 | ||||
| function markShadowedGlobals( | ||||
|   tokens, | ||||
|   scopes, | ||||
|   globalNames, | ||||
| ) { | ||||
|   const scopeStack = []; | ||||
|   let scopeIndex = scopes.length - 1; | ||||
|   // Scopes were generated at completion time, so they're sorted by end index, so we can maintain a
 | ||||
|   // good stack by going backwards through them.
 | ||||
|   for (let i = tokens.tokens.length - 1; ; i--) { | ||||
|     while (scopeStack.length > 0 && scopeStack[scopeStack.length - 1].startTokenIndex === i + 1) { | ||||
|       scopeStack.pop(); | ||||
|     } | ||||
|     while (scopeIndex >= 0 && scopes[scopeIndex].endTokenIndex === i + 1) { | ||||
|       scopeStack.push(scopes[scopeIndex]); | ||||
|       scopeIndex--; | ||||
|     } | ||||
|     // Process scopes after the last iteration so we can make sure we pop all of them.
 | ||||
|     if (i < 0) { | ||||
|       break; | ||||
|     } | ||||
| 
 | ||||
|     const token = tokens.tokens[i]; | ||||
|     const name = tokens.identifierNameForToken(token); | ||||
|     if (scopeStack.length > 1 && !token.isType && token.type === tt.name && globalNames.has(name)) { | ||||
|       if (isBlockScopedDeclaration(token)) { | ||||
|         markShadowedForScope(scopeStack[scopeStack.length - 1], tokens, name); | ||||
|       } else if (isFunctionScopedDeclaration(token)) { | ||||
|         let stackIndex = scopeStack.length - 1; | ||||
|         while (stackIndex > 0 && !scopeStack[stackIndex].isFunctionScope) { | ||||
|           stackIndex--; | ||||
|         } | ||||
|         if (stackIndex < 0) { | ||||
|           throw new Error("Did not find parent function scope."); | ||||
|         } | ||||
|         markShadowedForScope(scopeStack[stackIndex], tokens, name); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   if (scopeStack.length > 0) { | ||||
|     throw new Error("Expected empty scope stack after processing file."); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function markShadowedForScope(scope, tokens, name) { | ||||
|   for (let i = scope.startTokenIndex; i < scope.endTokenIndex; i++) { | ||||
|     const token = tokens.tokens[i]; | ||||
|     if ( | ||||
|       (token.type === tt.name || token.type === tt.jsxName) && | ||||
|       tokens.identifierNameForToken(token) === name | ||||
|     ) { | ||||
|       token.shadowsGlobal = true; | ||||
|     } | ||||
|   } | ||||
| } | ||||
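`markShadowedForScope` above flags every identifier with a given name inside a scope's token range. A reduced sketch with plain object tokens (hypothetical shapes, not sucrase's Token class):

```javascript
// Mark every occurrence of `name` inside [startTokenIndex, endTokenIndex) as
// shadowing a global, mirroring markShadowedForScope above.
function markShadowed(tokens, scope, name) {
  for (let i = scope.startTokenIndex; i < scope.endTokenIndex; i++) {
    if (tokens[i].name === name) {
      tokens[i].shadowsGlobal = true;
    }
  }
  return tokens;
}
```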
133  node_modules/sucrase/dist/esm/index.js  generated  vendored  Normal file
							|  | @ -0,0 +1,133 @@ | |||
| import CJSImportProcessor from "./CJSImportProcessor"; | ||||
| import computeSourceMap from "./computeSourceMap"; | ||||
| import {HelperManager} from "./HelperManager"; | ||||
| import identifyShadowedGlobals from "./identifyShadowedGlobals"; | ||||
| import NameManager from "./NameManager"; | ||||
| import {validateOptions} from "./Options"; | ||||
| 
 | ||||
| import {parse} from "./parser"; | ||||
| 
 | ||||
| import TokenProcessor from "./TokenProcessor"; | ||||
| import RootTransformer from "./transformers/RootTransformer"; | ||||
| import formatTokens from "./util/formatTokens"; | ||||
| import getTSImportedNames from "./util/getTSImportedNames"; | ||||
| 
 | ||||
| 
 | ||||
| export function getVersion() { | ||||
|   /* istanbul ignore next */ | ||||
|   return "3.34.0"; | ||||
| } | ||||
| 
 | ||||
| export function transform(code, options) { | ||||
|   validateOptions(options); | ||||
|   try { | ||||
|     const sucraseContext = getSucraseContext(code, options); | ||||
|     const transformer = new RootTransformer( | ||||
|       sucraseContext, | ||||
|       options.transforms, | ||||
|       Boolean(options.enableLegacyBabel5ModuleInterop), | ||||
|       options, | ||||
|     ); | ||||
|     const transformerResult = transformer.transform(); | ||||
|     let result = {code: transformerResult.code}; | ||||
|     if (options.sourceMapOptions) { | ||||
|       if (!options.filePath) { | ||||
|         throw new Error("filePath must be specified when generating a source map."); | ||||
|       } | ||||
|       result = { | ||||
|         ...result, | ||||
|         sourceMap: computeSourceMap( | ||||
|           transformerResult, | ||||
|           options.filePath, | ||||
|           options.sourceMapOptions, | ||||
|           code, | ||||
|           sucraseContext.tokenProcessor.tokens, | ||||
|         ), | ||||
|       }; | ||||
|     } | ||||
|     return result; | ||||
|     // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|   } catch (e) { | ||||
|     if (options.filePath) { | ||||
|       e.message = `Error transforming ${options.filePath}: ${e.message}`; | ||||
|     } | ||||
|     throw e; | ||||
|   } | ||||
| } | ||||
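The catch block above prefixes the failing file's path onto the error message before re-throwing, so callers can tell which file failed in a batch build. That pattern can be isolated as a small helper (`withFileContext` is a hypothetical name for illustration, not part of sucrase):

```javascript
// Hypothetical helper mirroring transform()'s error handling: run fn, and if
// it throws, prepend the file path to the error message before re-throwing.
function withFileContext(filePath, fn) {
  try {
    return fn();
  } catch (e) {
    if (filePath) {
      e.message = `Error transforming ${filePath}: ${e.message}`;
    }
    throw e;
  }
}
```

Mutating `message` on the caught error (rather than wrapping it in a new Error) preserves the original stack trace, which is the same trade-off the code above makes.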
| 
 | ||||
| /** | ||||
|  * Return a string representation of the sucrase tokens, mostly useful for | ||||
|  * diagnostic purposes. | ||||
|  */ | ||||
| export function getFormattedTokens(code, options) { | ||||
|   const tokens = getSucraseContext(code, options).tokenProcessor.tokens; | ||||
|   return formatTokens(code, tokens); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Call into the parser/tokenizer and do some further preprocessing: | ||||
|  * - Come up with a set of used names so that we can assign new names. | ||||
|  * - Preprocess all import/export statements so we know which globals we are interested in. | ||||
|  * - Compute situations where any of those globals are shadowed. | ||||
|  * | ||||
|  * In the future, some of these preprocessing steps can be skipped based on what actual work is | ||||
|  * being done. | ||||
|  */ | ||||
| function getSucraseContext(code, options) { | ||||
|   const isJSXEnabled = options.transforms.includes("jsx"); | ||||
|   const isTypeScriptEnabled = options.transforms.includes("typescript"); | ||||
|   const isFlowEnabled = options.transforms.includes("flow"); | ||||
|   const disableESTransforms = options.disableESTransforms === true; | ||||
|   const file = parse(code, isJSXEnabled, isTypeScriptEnabled, isFlowEnabled); | ||||
|   const tokens = file.tokens; | ||||
|   const scopes = file.scopes; | ||||
| 
 | ||||
|   const nameManager = new NameManager(code, tokens); | ||||
|   const helperManager = new HelperManager(nameManager); | ||||
|   const tokenProcessor = new TokenProcessor( | ||||
|     code, | ||||
|     tokens, | ||||
|     isFlowEnabled, | ||||
|     disableESTransforms, | ||||
|     helperManager, | ||||
|   ); | ||||
|   const enableLegacyTypeScriptModuleInterop = Boolean(options.enableLegacyTypeScriptModuleInterop); | ||||
| 
 | ||||
|   let importProcessor = null; | ||||
|   if (options.transforms.includes("imports")) { | ||||
|     importProcessor = new CJSImportProcessor( | ||||
|       nameManager, | ||||
|       tokenProcessor, | ||||
|       enableLegacyTypeScriptModuleInterop, | ||||
|       options, | ||||
|       options.transforms.includes("typescript"), | ||||
|       Boolean(options.keepUnusedImports), | ||||
|       helperManager, | ||||
|     ); | ||||
|     importProcessor.preprocessTokens(); | ||||
|     // We need to mark shadowed globals after processing imports so that we know which names | ||||
|     // refer to globals, but before type-only import pruning, since that relies on shadowing information. | ||||
|     identifyShadowedGlobals(tokenProcessor, scopes, importProcessor.getGlobalNames()); | ||||
|     if (options.transforms.includes("typescript") && !options.keepUnusedImports) { | ||||
|       importProcessor.pruneTypeOnlyImports(); | ||||
|     } | ||||
|   } else if (options.transforms.includes("typescript") && !options.keepUnusedImports) { | ||||
|     // Shadowed global detection is needed for TS implicit elision of imported names.
 | ||||
|     identifyShadowedGlobals(tokenProcessor, scopes, getTSImportedNames(tokenProcessor)); | ||||
|   } | ||||
|   return {tokenProcessor, scopes, nameManager, importProcessor, helperManager}; | ||||
| } | ||||
							
								
								
									
31  node_modules/sucrase/dist/esm/parser/index.js  generated  vendored  Normal file
|  | @ -0,0 +1,31 @@ | |||
| 
 | ||||
| 
 | ||||
| import {augmentError, initParser, state} from "./traverser/base"; | ||||
| import {parseFile} from "./traverser/index"; | ||||
| 
 | ||||
| export class File { | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor(tokens, scopes) { | ||||
|     this.tokens = tokens; | ||||
|     this.scopes = scopes; | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| export function parse( | ||||
|   input, | ||||
|   isJSXEnabled, | ||||
|   isTypeScriptEnabled, | ||||
|   isFlowEnabled, | ||||
| ) { | ||||
|   if (isFlowEnabled && isTypeScriptEnabled) { | ||||
|     throw new Error("Cannot combine flow and typescript plugins."); | ||||
|   } | ||||
|   initParser(input, isJSXEnabled, isTypeScriptEnabled, isFlowEnabled); | ||||
|   const result = parseFile(); | ||||
|   if (state.error) { | ||||
|     throw augmentError(state.error); | ||||
|   } | ||||
|   return result; | ||||
| } | ||||
							
								
								
									
1105  node_modules/sucrase/dist/esm/parser/plugins/flow.js  generated  vendored  Normal file
File diff suppressed because it is too large
367  node_modules/sucrase/dist/esm/parser/plugins/jsx/index.js  generated  vendored  Normal file
|  | @ -0,0 +1,367 @@ | |||
| import { | ||||
|   eat, | ||||
|   finishToken, | ||||
|   getTokenFromCode, | ||||
|   IdentifierRole, | ||||
|   JSXRole, | ||||
|   match, | ||||
|   next, | ||||
|   skipSpace, | ||||
|   Token, | ||||
| } from "../../tokenizer/index"; | ||||
| import {TokenType as tt} from "../../tokenizer/types"; | ||||
| import {input, isTypeScriptEnabled, state} from "../../traverser/base"; | ||||
| import {parseExpression, parseMaybeAssign} from "../../traverser/expression"; | ||||
| import {expect, unexpected} from "../../traverser/util"; | ||||
| import {charCodes} from "../../util/charcodes"; | ||||
| import {IS_IDENTIFIER_CHAR, IS_IDENTIFIER_START} from "../../util/identifier"; | ||||
| import {tsTryParseJSXTypeArgument} from "../typescript"; | ||||
| 
 | ||||
| /** | ||||
|  * Read token with JSX contents. | ||||
|  * | ||||
|  * In addition to detecting jsxTagStart and also regular tokens that might be | ||||
|  * part of an expression, this code detects the start and end of text ranges | ||||
|  * within JSX children. In order to properly count the number of children, we | ||||
|  * distinguish jsxText from jsxEmptyText, which is a text range that simplifies | ||||
|  * to the empty string after JSX whitespace trimming. | ||||
|  * | ||||
|  * It turns out that a JSX text range will simplify to the empty string if and | ||||
|  * only if both of these conditions hold: | ||||
|  * - The range consists entirely of whitespace characters (only counting space, | ||||
|  *   tab, \r, and \n). | ||||
|  * - The range has at least one newline. | ||||
|  * This can be proven by analyzing any implementation of whitespace trimming, | ||||
|  * e.g. formatJSXTextLiteral in Sucrase or cleanJSXElementLiteralChild in Babel. | ||||
|  */ | ||||
| function jsxReadToken() { | ||||
|   let sawNewline = false; | ||||
|   let sawNonWhitespace = false; | ||||
|   while (true) { | ||||
|     if (state.pos >= input.length) { | ||||
|       unexpected("Unterminated JSX contents"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const ch = input.charCodeAt(state.pos); | ||||
|     if (ch === charCodes.lessThan || ch === charCodes.leftCurlyBrace) { | ||||
|       if (state.pos === state.start) { | ||||
|         if (ch === charCodes.lessThan) { | ||||
|           state.pos++; | ||||
|           finishToken(tt.jsxTagStart); | ||||
|           return; | ||||
|         } | ||||
|         getTokenFromCode(ch); | ||||
|         return; | ||||
|       } | ||||
|       if (sawNewline && !sawNonWhitespace) { | ||||
|         finishToken(tt.jsxEmptyText); | ||||
|       } else { | ||||
|         finishToken(tt.jsxText); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     // This is part of JSX text.
 | ||||
|     if (ch === charCodes.lineFeed) { | ||||
|       sawNewline = true; | ||||
|     } else if (ch !== charCodes.space && ch !== charCodes.carriageReturn && ch !== charCodes.tab) { | ||||
|       sawNonWhitespace = true; | ||||
|     } | ||||
|     state.pos++; | ||||
|   } | ||||
| } | ||||
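The doc comment's invariant — a JSX text range trims to the empty string if and only if it consists entirely of whitespace and contains at least one newline — can be checked directly with a standalone predicate (a sketch for illustration, not part of sucrase):

```javascript
// Returns true iff `text` would simplify to "" under JSX whitespace trimming:
// every character is space, tab, \r, or \n, and at least one \n is present.
function simplifiesToEmptyJSXText(text) {
  let sawNewline = false;
  for (const ch of text) {
    if (ch === "\n") {
      sawNewline = true;
    } else if (ch !== " " && ch !== "\t" && ch !== "\r") {
      return false; // non-whitespace character: the range keeps content
    }
  }
  return sawNewline;
}
```

This mirrors the `sawNewline` / `sawNonWhitespace` flags tracked in jsxReadToken, which decide between the jsxText and jsxEmptyText token types.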
| 
 | ||||
| function jsxReadString(quote) { | ||||
|   state.pos++; | ||||
|   for (;;) { | ||||
|     if (state.pos >= input.length) { | ||||
|       unexpected("Unterminated string constant"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const ch = input.charCodeAt(state.pos); | ||||
|     if (ch === quote) { | ||||
|       state.pos++; | ||||
|       break; | ||||
|     } | ||||
|     state.pos++; | ||||
|   } | ||||
|   finishToken(tt.string); | ||||
| } | ||||
| 
 | ||||
| // Read a JSX identifier (valid tag or attribute name).
 | ||||
| //
 | ||||
| // Optimized version since JSX identifiers can't contain
 | ||||
| // escape characters and so can be read as single slice.
 | ||||
| // Also assumes that first character was already checked
 | ||||
| // by isIdentifierStart in readToken.
 | ||||
| 
 | ||||
| function jsxReadWord() { | ||||
|   let ch; | ||||
|   do { | ||||
|     if (state.pos > input.length) { | ||||
|       unexpected("Unexpectedly reached the end of input."); | ||||
|       return; | ||||
|     } | ||||
|     ch = input.charCodeAt(++state.pos); | ||||
|   } while (IS_IDENTIFIER_CHAR[ch] || ch === charCodes.dash); | ||||
|   finishToken(tt.jsxName); | ||||
| } | ||||
| 
 | ||||
| // Parse next token as JSX identifier
 | ||||
| function jsxParseIdentifier() { | ||||
|   nextJSXTagToken(); | ||||
| } | ||||
| 
 | ||||
| // Parse namespaced identifier.
 | ||||
| function jsxParseNamespacedName(identifierRole) { | ||||
|   jsxParseIdentifier(); | ||||
|   if (!eat(tt.colon)) { | ||||
|     // Plain identifier, so this is an access.
 | ||||
|     state.tokens[state.tokens.length - 1].identifierRole = identifierRole; | ||||
|     return; | ||||
|   } | ||||
|   // Process the second half of the namespaced name.
 | ||||
|   jsxParseIdentifier(); | ||||
| } | ||||
| 
 | ||||
| // Parses element name in any form - namespaced, member
 | ||||
| // or single identifier.
 | ||||
| function jsxParseElementName() { | ||||
|   const firstTokenIndex = state.tokens.length; | ||||
|   jsxParseNamespacedName(IdentifierRole.Access); | ||||
|   let hadDot = false; | ||||
|   while (match(tt.dot)) { | ||||
|     hadDot = true; | ||||
|     nextJSXTagToken(); | ||||
|     jsxParseIdentifier(); | ||||
|   } | ||||
|   // For tags like <div> with a lowercase letter and no dots, the name is
 | ||||
|   // actually *not* an identifier access, since it's referring to a built-in
 | ||||
|   // tag name. Remove the identifier role in this case so that it's not
 | ||||
|   // accidentally transformed by the imports transform when preserving JSX.
 | ||||
|   if (!hadDot) { | ||||
|     const firstToken = state.tokens[firstTokenIndex]; | ||||
|     const firstChar = input.charCodeAt(firstToken.start); | ||||
|     if (firstChar >= charCodes.lowercaseA && firstChar <= charCodes.lowercaseZ) { | ||||
|       firstToken.identifierRole = null; | ||||
|     } | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Parses any type of JSX attribute value.
 | ||||
| function jsxParseAttributeValue() { | ||||
|   switch (state.type) { | ||||
|     case tt.braceL: | ||||
|       next(); | ||||
|       parseExpression(); | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
| 
 | ||||
|     case tt.jsxTagStart: | ||||
|       jsxParseElement(); | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
| 
 | ||||
|     case tt.string: | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
| 
 | ||||
|     default: | ||||
|       unexpected("JSX value should be either an expression or a quoted JSX text"); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Parse JSX spread child, after already processing the {
 | ||||
| // Does not parse the closing }
 | ||||
| function jsxParseSpreadChild() { | ||||
|   expect(tt.ellipsis); | ||||
|   parseExpression(); | ||||
| } | ||||
| 
 | ||||
| // Parses JSX opening tag starting after "<".
 | ||||
| // Returns true if the tag was self-closing.
 | ||||
| // Does not parse the last token.
 | ||||
| function jsxParseOpeningElement(initialTokenIndex) { | ||||
|   if (match(tt.jsxTagEnd)) { | ||||
|     // This is an open-fragment.
 | ||||
|     return false; | ||||
|   } | ||||
|   jsxParseElementName(); | ||||
|   if (isTypeScriptEnabled) { | ||||
|     tsTryParseJSXTypeArgument(); | ||||
|   } | ||||
|   let hasSeenPropSpread = false; | ||||
|   while (!match(tt.slash) && !match(tt.jsxTagEnd) && !state.error) { | ||||
|     if (eat(tt.braceL)) { | ||||
|       hasSeenPropSpread = true; | ||||
|       expect(tt.ellipsis); | ||||
|       parseMaybeAssign(); | ||||
|       // }
 | ||||
|       nextJSXTagToken(); | ||||
|       continue; | ||||
|     } | ||||
|     if ( | ||||
|       hasSeenPropSpread && | ||||
|       state.end - state.start === 3 && | ||||
|       input.charCodeAt(state.start) === charCodes.lowercaseK && | ||||
|       input.charCodeAt(state.start + 1) === charCodes.lowercaseE && | ||||
|       input.charCodeAt(state.start + 2) === charCodes.lowercaseY | ||||
|     ) { | ||||
|       state.tokens[initialTokenIndex].jsxRole = JSXRole.KeyAfterPropSpread; | ||||
|     } | ||||
|     jsxParseNamespacedName(IdentifierRole.ObjectKey); | ||||
|     if (match(tt.eq)) { | ||||
|       nextJSXTagToken(); | ||||
|       jsxParseAttributeValue(); | ||||
|     } | ||||
|   } | ||||
|   const isSelfClosing = match(tt.slash); | ||||
|   if (isSelfClosing) { | ||||
|     // /
 | ||||
|     nextJSXTagToken(); | ||||
|   } | ||||
|   return isSelfClosing; | ||||
| } | ||||
| 
 | ||||
| // Parses JSX closing tag starting after "</".
 | ||||
| // Does not parse the last token.
 | ||||
| function jsxParseClosingElement() { | ||||
|   if (match(tt.jsxTagEnd)) { | ||||
|     // Fragment syntax, so we immediately have a tag end.
 | ||||
|     return; | ||||
|   } | ||||
|   jsxParseElementName(); | ||||
| } | ||||
| 
 | ||||
| // Parses entire JSX element, including its opening tag
 | ||||
| // (starting after "<"), attributes, contents and closing tag.
 | ||||
| // Does not parse the last token.
 | ||||
| function jsxParseElementAt() { | ||||
|   const initialTokenIndex = state.tokens.length - 1; | ||||
|   state.tokens[initialTokenIndex].jsxRole = JSXRole.NoChildren; | ||||
|   let numExplicitChildren = 0; | ||||
|   const isSelfClosing = jsxParseOpeningElement(initialTokenIndex); | ||||
|   if (!isSelfClosing) { | ||||
|     nextJSXExprToken(); | ||||
|     while (true) { | ||||
|       switch (state.type) { | ||||
|         case tt.jsxTagStart: | ||||
|           nextJSXTagToken(); | ||||
|           if (match(tt.slash)) { | ||||
|             nextJSXTagToken(); | ||||
|             jsxParseClosingElement(); | ||||
|             // Key after prop spread takes precedence over number of children,
 | ||||
|             // since it means we switch to createElement, which doesn't care
 | ||||
|             // about number of children.
 | ||||
|             if (state.tokens[initialTokenIndex].jsxRole !== JSXRole.KeyAfterPropSpread) { | ||||
|               if (numExplicitChildren === 1) { | ||||
|                 state.tokens[initialTokenIndex].jsxRole = JSXRole.OneChild; | ||||
|               } else if (numExplicitChildren > 1) { | ||||
|                 state.tokens[initialTokenIndex].jsxRole = JSXRole.StaticChildren; | ||||
|               } | ||||
|             } | ||||
|             return; | ||||
|           } | ||||
|           numExplicitChildren++; | ||||
|           jsxParseElementAt(); | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
| 
 | ||||
|         case tt.jsxText: | ||||
|           numExplicitChildren++; | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
| 
 | ||||
|         case tt.jsxEmptyText: | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
| 
 | ||||
|         case tt.braceL: | ||||
|           next(); | ||||
|           if (match(tt.ellipsis)) { | ||||
|             jsxParseSpreadChild(); | ||||
|             nextJSXExprToken(); | ||||
|             // Spread children are a mechanism to explicitly mark children as
 | ||||
|             // static, so count it as 2 children to satisfy the "more than one
 | ||||
|             // child" condition.
 | ||||
|             numExplicitChildren += 2; | ||||
|           } else { | ||||
|             // If we see {}, this is an empty pseudo-expression that doesn't
 | ||||
|             // count as a child.
 | ||||
|             if (!match(tt.braceR)) { | ||||
|               numExplicitChildren++; | ||||
|               parseExpression(); | ||||
|             } | ||||
|             nextJSXExprToken(); | ||||
|           } | ||||
| 
 | ||||
|           break; | ||||
| 
 | ||||
|         // istanbul ignore next - should never happen
 | ||||
|         default: | ||||
|           unexpected(); | ||||
|           return; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
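The bookkeeping above reduces to a small decision table: a `key` attribute after a prop spread forces the createElement path (child count is then irrelevant), and otherwise the explicit child count selects the role. A hypothetical standalone distillation:

```javascript
// Hypothetical distillation of the jsxRole bookkeeping in jsxParseElementAt.
// Spread children are counted as 2 by the parser, so they always land in
// the StaticChildren branch.
function jsxRoleFor(numExplicitChildren, keyAfterPropSpread) {
  if (keyAfterPropSpread) return "KeyAfterPropSpread";
  if (numExplicitChildren === 0) return "NoChildren";
  if (numExplicitChildren === 1) return "OneChild";
  return "StaticChildren";
}
```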
| 
 | ||||
| // Parses entire JSX element from current position.
 | ||||
| // Does not parse the last token.
 | ||||
| export function jsxParseElement() { | ||||
|   nextJSXTagToken(); | ||||
|   jsxParseElementAt(); | ||||
| } | ||||
| 
 | ||||
| // ==================================
 | ||||
| // Overrides
 | ||||
| // ==================================
 | ||||
| 
 | ||||
| export function nextJSXTagToken() { | ||||
|   state.tokens.push(new Token()); | ||||
|   skipSpace(); | ||||
|   state.start = state.pos; | ||||
|   const code = input.charCodeAt(state.pos); | ||||
| 
 | ||||
|   if (IS_IDENTIFIER_START[code]) { | ||||
|     jsxReadWord(); | ||||
|   } else if (code === charCodes.quotationMark || code === charCodes.apostrophe) { | ||||
|     jsxReadString(code); | ||||
|   } else { | ||||
|     // The following tokens are just one character each.
 | ||||
|     ++state.pos; | ||||
|     switch (code) { | ||||
|       case charCodes.greaterThan: | ||||
|         finishToken(tt.jsxTagEnd); | ||||
|         break; | ||||
|       case charCodes.lessThan: | ||||
|         finishToken(tt.jsxTagStart); | ||||
|         break; | ||||
|       case charCodes.slash: | ||||
|         finishToken(tt.slash); | ||||
|         break; | ||||
|       case charCodes.equalsTo: | ||||
|         finishToken(tt.eq); | ||||
|         break; | ||||
|       case charCodes.leftCurlyBrace: | ||||
|         finishToken(tt.braceL); | ||||
|         break; | ||||
|       case charCodes.dot: | ||||
|         finishToken(tt.dot); | ||||
|         break; | ||||
|       case charCodes.colon: | ||||
|         finishToken(tt.colon); | ||||
|         break; | ||||
|       default: | ||||
|         unexpected(); | ||||
|     } | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function nextJSXExprToken() { | ||||
|   state.tokens.push(new Token()); | ||||
|   state.start = state.pos; | ||||
|   jsxReadToken(); | ||||
| } | ||||
							
								
								
									
256  node_modules/sucrase/dist/esm/parser/plugins/jsx/xhtml.js  generated  vendored  Normal file
|  | @ -0,0 +1,256 @@ | |||
| // Use a Map rather than object to avoid unexpected __proto__ access.
 | ||||
| export default new Map([ | ||||
|   ["quot", "\u0022"], | ||||
|   ["amp", "&"], | ||||
|   ["apos", "\u0027"], | ||||
|   ["lt", "<"], | ||||
|   ["gt", ">"], | ||||
|   ["nbsp", "\u00A0"], | ||||
|   ["iexcl", "\u00A1"], | ||||
|   ["cent", "\u00A2"], | ||||
|   ["pound", "\u00A3"], | ||||
|   ["curren", "\u00A4"], | ||||
|   ["yen", "\u00A5"], | ||||
|   ["brvbar", "\u00A6"], | ||||
|   ["sect", "\u00A7"], | ||||
|   ["uml", "\u00A8"], | ||||
|   ["copy", "\u00A9"], | ||||
|   ["ordf", "\u00AA"], | ||||
|   ["laquo", "\u00AB"], | ||||
|   ["not", "\u00AC"], | ||||
|   ["shy", "\u00AD"], | ||||
|   ["reg", "\u00AE"], | ||||
|   ["macr", "\u00AF"], | ||||
|   ["deg", "\u00B0"], | ||||
|   ["plusmn", "\u00B1"], | ||||
|   ["sup2", "\u00B2"], | ||||
|   ["sup3", "\u00B3"], | ||||
|   ["acute", "\u00B4"], | ||||
|   ["micro", "\u00B5"], | ||||
|   ["para", "\u00B6"], | ||||
|   ["middot", "\u00B7"], | ||||
|   ["cedil", "\u00B8"], | ||||
|   ["sup1", "\u00B9"], | ||||
|   ["ordm", "\u00BA"], | ||||
|   ["raquo", "\u00BB"], | ||||
|   ["frac14", "\u00BC"], | ||||
|   ["frac12", "\u00BD"], | ||||
|   ["frac34", "\u00BE"], | ||||
|   ["iquest", "\u00BF"], | ||||
|   ["Agrave", "\u00C0"], | ||||
|   ["Aacute", "\u00C1"], | ||||
|   ["Acirc", "\u00C2"], | ||||
|   ["Atilde", "\u00C3"], | ||||
|   ["Auml", "\u00C4"], | ||||
|   ["Aring", "\u00C5"], | ||||
|   ["AElig", "\u00C6"], | ||||
|   ["Ccedil", "\u00C7"], | ||||
|   ["Egrave", "\u00C8"], | ||||
|   ["Eacute", "\u00C9"], | ||||
|   ["Ecirc", "\u00CA"], | ||||
|   ["Euml", "\u00CB"], | ||||
|   ["Igrave", "\u00CC"], | ||||
|   ["Iacute", "\u00CD"], | ||||
|   ["Icirc", "\u00CE"], | ||||
|   ["Iuml", "\u00CF"], | ||||
|   ["ETH", "\u00D0"], | ||||
|   ["Ntilde", "\u00D1"], | ||||
|   ["Ograve", "\u00D2"], | ||||
|   ["Oacute", "\u00D3"], | ||||
|   ["Ocirc", "\u00D4"], | ||||
|   ["Otilde", "\u00D5"], | ||||
|   ["Ouml", "\u00D6"], | ||||
|   ["times", "\u00D7"], | ||||
|   ["Oslash", "\u00D8"], | ||||
|   ["Ugrave", "\u00D9"], | ||||
|   ["Uacute", "\u00DA"], | ||||
|   ["Ucirc", "\u00DB"], | ||||
|   ["Uuml", "\u00DC"], | ||||
|   ["Yacute", "\u00DD"], | ||||
|   ["THORN", "\u00DE"], | ||||
|   ["szlig", "\u00DF"], | ||||
|   ["agrave", "\u00E0"], | ||||
|   ["aacute", "\u00E1"], | ||||
|   ["acirc", "\u00E2"], | ||||
|   ["atilde", "\u00E3"], | ||||
|   ["auml", "\u00E4"], | ||||
|   ["aring", "\u00E5"], | ||||
|   ["aelig", "\u00E6"], | ||||
|   ["ccedil", "\u00E7"], | ||||
|   ["egrave", "\u00E8"], | ||||
|   ["eacute", "\u00E9"], | ||||
|   ["ecirc", "\u00EA"], | ||||
|   ["euml", "\u00EB"], | ||||
|   ["igrave", "\u00EC"], | ||||
|   ["iacute", "\u00ED"], | ||||
|   ["icirc", "\u00EE"], | ||||
|   ["iuml", "\u00EF"], | ||||
|   ["eth", "\u00F0"], | ||||
|   ["ntilde", "\u00F1"], | ||||
|   ["ograve", "\u00F2"], | ||||
|   ["oacute", "\u00F3"], | ||||
|   ["ocirc", "\u00F4"], | ||||
|   ["otilde", "\u00F5"], | ||||
|   ["ouml", "\u00F6"], | ||||
|   ["divide", "\u00F7"], | ||||
|   ["oslash", "\u00F8"], | ||||
|   ["ugrave", "\u00F9"], | ||||
|   ["uacute", "\u00FA"], | ||||
|   ["ucirc", "\u00FB"], | ||||
|   ["uuml", "\u00FC"], | ||||
|   ["yacute", "\u00FD"], | ||||
|   ["thorn", "\u00FE"], | ||||
|   ["yuml", "\u00FF"], | ||||
|   ["OElig", "\u0152"], | ||||
|   ["oelig", "\u0153"], | ||||
|   ["Scaron", "\u0160"], | ||||
|   ["scaron", "\u0161"], | ||||
|   ["Yuml", "\u0178"], | ||||
|   ["fnof", "\u0192"], | ||||
|   ["circ", "\u02C6"], | ||||
|   ["tilde", "\u02DC"], | ||||
|   ["Alpha", "\u0391"], | ||||
|   ["Beta", "\u0392"], | ||||
|   ["Gamma", "\u0393"], | ||||
|   ["Delta", "\u0394"], | ||||
|   ["Epsilon", "\u0395"], | ||||
|   ["Zeta", "\u0396"], | ||||
|   ["Eta", "\u0397"], | ||||
|   ["Theta", "\u0398"], | ||||
|   ["Iota", "\u0399"], | ||||
|   ["Kappa", "\u039A"], | ||||
|   ["Lambda", "\u039B"], | ||||
|   ["Mu", "\u039C"], | ||||
|   ["Nu", "\u039D"], | ||||
|   ["Xi", "\u039E"], | ||||
|   ["Omicron", "\u039F"], | ||||
|   ["Pi", "\u03A0"], | ||||
|   ["Rho", "\u03A1"], | ||||
|   ["Sigma", "\u03A3"], | ||||
|   ["Tau", "\u03A4"], | ||||
|   ["Upsilon", "\u03A5"], | ||||
|   ["Phi", "\u03A6"], | ||||
|   ["Chi", "\u03A7"], | ||||
|   ["Psi", "\u03A8"], | ||||
|   ["Omega", "\u03A9"], | ||||
|   ["alpha", "\u03B1"], | ||||
|   ["beta", "\u03B2"], | ||||
|   ["gamma", "\u03B3"], | ||||
|   ["delta", "\u03B4"], | ||||
|   ["epsilon", "\u03B5"], | ||||
|   ["zeta", "\u03B6"], | ||||
|   ["eta", "\u03B7"], | ||||
|   ["theta", "\u03B8"], | ||||
|   ["iota", "\u03B9"], | ||||
|   ["kappa", "\u03BA"], | ||||
|   ["lambda", "\u03BB"], | ||||
|   ["mu", "\u03BC"], | ||||
|   ["nu", "\u03BD"], | ||||
|   ["xi", "\u03BE"], | ||||
|   ["omicron", "\u03BF"], | ||||
|   ["pi", "\u03C0"], | ||||
|   ["rho", "\u03C1"], | ||||
|   ["sigmaf", "\u03C2"], | ||||
|   ["sigma", "\u03C3"], | ||||
|   ["tau", "\u03C4"], | ||||
|   ["upsilon", "\u03C5"], | ||||
|   ["phi", "\u03C6"], | ||||
|   ["chi", "\u03C7"], | ||||
|   ["psi", "\u03C8"], | ||||
|   ["omega", "\u03C9"], | ||||
|   ["thetasym", "\u03D1"], | ||||
|   ["upsih", "\u03D2"], | ||||
|   ["piv", "\u03D6"], | ||||
|   ["ensp", "\u2002"], | ||||
|   ["emsp", "\u2003"], | ||||
|   ["thinsp", "\u2009"], | ||||
|   ["zwnj", "\u200C"], | ||||
|   ["zwj", "\u200D"], | ||||
|   ["lrm", "\u200E"], | ||||
|   ["rlm", "\u200F"], | ||||
|   ["ndash", "\u2013"], | ||||
|   ["mdash", "\u2014"], | ||||
|   ["lsquo", "\u2018"], | ||||
|   ["rsquo", "\u2019"], | ||||
|   ["sbquo", "\u201A"], | ||||
|   ["ldquo", "\u201C"], | ||||
|   ["rdquo", "\u201D"], | ||||
|   ["bdquo", "\u201E"], | ||||
|   ["dagger", "\u2020"], | ||||
|   ["Dagger", "\u2021"], | ||||
|   ["bull", "\u2022"], | ||||
|   ["hellip", "\u2026"], | ||||
|   ["permil", "\u2030"], | ||||
|   ["prime", "\u2032"], | ||||
|   ["Prime", "\u2033"], | ||||
|   ["lsaquo", "\u2039"], | ||||
|   ["rsaquo", "\u203A"], | ||||
|   ["oline", "\u203E"], | ||||
|   ["frasl", "\u2044"], | ||||
|   ["euro", "\u20AC"], | ||||
|   ["image", "\u2111"], | ||||
|   ["weierp", "\u2118"], | ||||
|   ["real", "\u211C"], | ||||
|   ["trade", "\u2122"], | ||||
|   ["alefsym", "\u2135"], | ||||
|   ["larr", "\u2190"], | ||||
|   ["uarr", "\u2191"], | ||||
|   ["rarr", "\u2192"], | ||||
|   ["darr", "\u2193"], | ||||
|   ["harr", "\u2194"], | ||||
|   ["crarr", "\u21B5"], | ||||
|   ["lArr", "\u21D0"], | ||||
|   ["uArr", "\u21D1"], | ||||
|   ["rArr", "\u21D2"], | ||||
|   ["dArr", "\u21D3"], | ||||
|   ["hArr", "\u21D4"], | ||||
|   ["forall", "\u2200"], | ||||
|   ["part", "\u2202"], | ||||
|   ["exist", "\u2203"], | ||||
|   ["empty", "\u2205"], | ||||
|   ["nabla", "\u2207"], | ||||
|   ["isin", "\u2208"], | ||||
|   ["notin", "\u2209"], | ||||
|   ["ni", "\u220B"], | ||||
|   ["prod", "\u220F"], | ||||
|   ["sum", "\u2211"], | ||||
|   ["minus", "\u2212"], | ||||
|   ["lowast", "\u2217"], | ||||
|   ["radic", "\u221A"], | ||||
|   ["prop", "\u221D"], | ||||
|   ["infin", "\u221E"], | ||||
|   ["ang", "\u2220"], | ||||
|   ["and", "\u2227"], | ||||
|   ["or", "\u2228"], | ||||
|   ["cap", "\u2229"], | ||||
|   ["cup", "\u222A"], | ||||
|   ["int", "\u222B"], | ||||
|   ["there4", "\u2234"], | ||||
|   ["sim", "\u223C"], | ||||
|   ["cong", "\u2245"], | ||||
|   ["asymp", "\u2248"], | ||||
|   ["ne", "\u2260"], | ||||
|   ["equiv", "\u2261"], | ||||
|   ["le", "\u2264"], | ||||
|   ["ge", "\u2265"], | ||||
|   ["sub", "\u2282"], | ||||
|   ["sup", "\u2283"], | ||||
|   ["nsub", "\u2284"], | ||||
|   ["sube", "\u2286"], | ||||
|   ["supe", "\u2287"], | ||||
|   ["oplus", "\u2295"], | ||||
|   ["otimes", "\u2297"], | ||||
|   ["perp", "\u22A5"], | ||||
|   ["sdot", "\u22C5"], | ||||
|   ["lceil", "\u2308"], | ||||
|   ["rceil", "\u2309"], | ||||
|   ["lfloor", "\u230A"], | ||||
|   ["rfloor", "\u230B"], | ||||
|   ["lang", "\u2329"], | ||||
|   ["rang", "\u232A"], | ||||
|   ["loz", "\u25CA"], | ||||
|   ["spades", "\u2660"], | ||||
|   ["clubs", "\u2663"], | ||||
|   ["hearts", "\u2665"], | ||||
|   ["diams", "\u2666"], | ||||
| ]); | ||||
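A Map like the one above supports entity decoding in JSX text. A minimal sketch of how such a table might be consulted (toy subset of the entries; `decodeEntities` is illustrative, not sucrase's actual decoder):

```javascript
// Toy subset of the entity table above, keyed by entity name.
const XHTML_ENTITIES = new Map([
  ["amp", "&"],
  ["lt", "<"],
  ["gt", ">"],
  ["nbsp", "\u00A0"],
]);

// Replace each named entity with its character; unknown names pass through.
function decodeEntities(text) {
  return text.replace(/&([a-zA-Z0-9]+);/g, (match, name) =>
    XHTML_ENTITIES.has(name) ? XHTML_ENTITIES.get(name) : match,
  );
}
```

Using a Map here (as the comment at the top of the file notes) avoids surprising hits on object prototype keys such as `__proto__` or `constructor`.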
							
								
								
									
37  node_modules/sucrase/dist/esm/parser/plugins/types.js  generated  vendored  Normal file
|  | @ -0,0 +1,37 @@ | |||
| import {eatTypeToken, lookaheadType, match} from "../tokenizer/index"; | ||||
| import {TokenType as tt} from "../tokenizer/types"; | ||||
| import {isFlowEnabled, isTypeScriptEnabled} from "../traverser/base"; | ||||
| import {baseParseConditional} from "../traverser/expression"; | ||||
| import {flowParseTypeAnnotation} from "./flow"; | ||||
| import {tsParseTypeAnnotation} from "./typescript"; | ||||
| 
 | ||||
| /** | ||||
|  * Common parser code for TypeScript and Flow. | ||||
|  */ | ||||
| 
 | ||||
| // An apparent conditional expression could actually be an optional parameter in an arrow function.
 | ||||
| export function typedParseConditional(noIn) { | ||||
|   // If we see ?:, this can't possibly be a valid conditional. typedParseParenItem will be called
 | ||||
|   // later to finish off the arrow parameter. We also need to handle bare ? tokens for optional
 | ||||
|   // parameters without type annotations, i.e. ?, and ?) .
 | ||||
|   if (match(tt.question)) { | ||||
|     const nextType = lookaheadType(); | ||||
|     if (nextType === tt.colon || nextType === tt.comma || nextType === tt.parenR) { | ||||
|       return; | ||||
|     } | ||||
|   } | ||||
|   baseParseConditional(noIn); | ||||
| } | ||||
| 
 | ||||
| // Note: These "type casts" are *not* valid TS expressions.
 | ||||
| // But we parse them here and change them when completing the arrow function.
 | ||||
| export function typedParseParenItem() { | ||||
|   eatTypeToken(tt.question); | ||||
|   if (match(tt.colon)) { | ||||
|     if (isTypeScriptEnabled) { | ||||
|       tsParseTypeAnnotation(); | ||||
|     } else if (isFlowEnabled) { | ||||
|       flowParseTypeAnnotation(); | ||||
|     } | ||||
|   } | ||||
| } | ||||
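The `?:` lookahead in `typedParseConditional` above can be illustrated outside the parser. A minimal standalone sketch, not sucrase's API — the token strings and the `startsConditional` helper are hypothetical stand-ins for the real `match`/`lookaheadType` machinery:

```javascript
// Sketch of the disambiguation in typedParseConditional: a "?" followed
// by ":", ",", or ")" cannot start a conditional expression, so it must
// be an optional arrow parameter like (a?) => a or (a?: number) => a.
// Token values here are illustrative strings, not sucrase token types.
function startsConditional(tokens, pos) {
  if (tokens[pos] !== "?") return false;
  const next = tokens[pos + 1];
  // ?: , ?, and ?) all mark an optional parameter, so bail out early.
  return next !== ":" && next !== "," && next !== ")";
}

console.log(startsConditional(["?", "x", ":", "y"], 0)); // true:  a ? x : y
console.log(startsConditional(["?", ":", "number"], 0)); // false: (a?: number)
console.log(startsConditional(["?", ")"], 0));           // false: (a?)
```

The real parser does the same check one token at a time, then falls through to `baseParseConditional` only when the `?` can still be a ternary.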
							
								
								
									
										1632
									
								
								node_modules/sucrase/dist/esm/parser/plugins/typescript.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							
										
											
												File diff suppressed because it is too large
							
								
								
									
										1004
									
								
								node_modules/sucrase/dist/esm/parser/tokenizer/index.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							
										
											
												File diff suppressed because it is too large
							
								
								
									
										43
									
								
								node_modules/sucrase/dist/esm/parser/tokenizer/keywords.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,43 @@ | |||
| export var ContextualKeyword; (function (ContextualKeyword) { | ||||
|   const NONE = 0; ContextualKeyword[ContextualKeyword["NONE"] = NONE] = "NONE"; | ||||
|   const _abstract = NONE + 1; ContextualKeyword[ContextualKeyword["_abstract"] = _abstract] = "_abstract"; | ||||
|   const _accessor = _abstract + 1; ContextualKeyword[ContextualKeyword["_accessor"] = _accessor] = "_accessor"; | ||||
|   const _as = _accessor + 1; ContextualKeyword[ContextualKeyword["_as"] = _as] = "_as"; | ||||
|   const _assert = _as + 1; ContextualKeyword[ContextualKeyword["_assert"] = _assert] = "_assert"; | ||||
|   const _asserts = _assert + 1; ContextualKeyword[ContextualKeyword["_asserts"] = _asserts] = "_asserts"; | ||||
|   const _async = _asserts + 1; ContextualKeyword[ContextualKeyword["_async"] = _async] = "_async"; | ||||
|   const _await = _async + 1; ContextualKeyword[ContextualKeyword["_await"] = _await] = "_await"; | ||||
|   const _checks = _await + 1; ContextualKeyword[ContextualKeyword["_checks"] = _checks] = "_checks"; | ||||
|   const _constructor = _checks + 1; ContextualKeyword[ContextualKeyword["_constructor"] = _constructor] = "_constructor"; | ||||
|   const _declare = _constructor + 1; ContextualKeyword[ContextualKeyword["_declare"] = _declare] = "_declare"; | ||||
|   const _enum = _declare + 1; ContextualKeyword[ContextualKeyword["_enum"] = _enum] = "_enum"; | ||||
|   const _exports = _enum + 1; ContextualKeyword[ContextualKeyword["_exports"] = _exports] = "_exports"; | ||||
|   const _from = _exports + 1; ContextualKeyword[ContextualKeyword["_from"] = _from] = "_from"; | ||||
|   const _get = _from + 1; ContextualKeyword[ContextualKeyword["_get"] = _get] = "_get"; | ||||
|   const _global = _get + 1; ContextualKeyword[ContextualKeyword["_global"] = _global] = "_global"; | ||||
|   const _implements = _global + 1; ContextualKeyword[ContextualKeyword["_implements"] = _implements] = "_implements"; | ||||
|   const _infer = _implements + 1; ContextualKeyword[ContextualKeyword["_infer"] = _infer] = "_infer"; | ||||
|   const _interface = _infer + 1; ContextualKeyword[ContextualKeyword["_interface"] = _interface] = "_interface"; | ||||
|   const _is = _interface + 1; ContextualKeyword[ContextualKeyword["_is"] = _is] = "_is"; | ||||
|   const _keyof = _is + 1; ContextualKeyword[ContextualKeyword["_keyof"] = _keyof] = "_keyof"; | ||||
|   const _mixins = _keyof + 1; ContextualKeyword[ContextualKeyword["_mixins"] = _mixins] = "_mixins"; | ||||
|   const _module = _mixins + 1; ContextualKeyword[ContextualKeyword["_module"] = _module] = "_module"; | ||||
|   const _namespace = _module + 1; ContextualKeyword[ContextualKeyword["_namespace"] = _namespace] = "_namespace"; | ||||
|   const _of = _namespace + 1; ContextualKeyword[ContextualKeyword["_of"] = _of] = "_of"; | ||||
|   const _opaque = _of + 1; ContextualKeyword[ContextualKeyword["_opaque"] = _opaque] = "_opaque"; | ||||
|   const _out = _opaque + 1; ContextualKeyword[ContextualKeyword["_out"] = _out] = "_out"; | ||||
|   const _override = _out + 1; ContextualKeyword[ContextualKeyword["_override"] = _override] = "_override"; | ||||
|   const _private = _override + 1; ContextualKeyword[ContextualKeyword["_private"] = _private] = "_private"; | ||||
|   const _protected = _private + 1; ContextualKeyword[ContextualKeyword["_protected"] = _protected] = "_protected"; | ||||
|   const _proto = _protected + 1; ContextualKeyword[ContextualKeyword["_proto"] = _proto] = "_proto"; | ||||
|   const _public = _proto + 1; ContextualKeyword[ContextualKeyword["_public"] = _public] = "_public"; | ||||
|   const _readonly = _public + 1; ContextualKeyword[ContextualKeyword["_readonly"] = _readonly] = "_readonly"; | ||||
|   const _require = _readonly + 1; ContextualKeyword[ContextualKeyword["_require"] = _require] = "_require"; | ||||
|   const _satisfies = _require + 1; ContextualKeyword[ContextualKeyword["_satisfies"] = _satisfies] = "_satisfies"; | ||||
|   const _set = _satisfies + 1; ContextualKeyword[ContextualKeyword["_set"] = _set] = "_set"; | ||||
|   const _static = _set + 1; ContextualKeyword[ContextualKeyword["_static"] = _static] = "_static"; | ||||
|   const _symbol = _static + 1; ContextualKeyword[ContextualKeyword["_symbol"] = _symbol] = "_symbol"; | ||||
|   const _type = _symbol + 1; ContextualKeyword[ContextualKeyword["_type"] = _type] = "_type"; | ||||
|   const _unique = _type + 1; ContextualKeyword[ContextualKeyword["_unique"] = _unique] = "_unique"; | ||||
|   const _using = _unique + 1; ContextualKeyword[ContextualKeyword["_using"] = _using] = "_using"; | ||||
| })(ContextualKeyword || (ContextualKeyword = {})); | ||||
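The `keywords.js` file above is TypeScript-enum output: an IIFE that fills one object with both a forward (name → number) and a reverse (number → name) mapping via the `Enum[Enum["Name"] = value] = "Name"` idiom. A tiny reproduction with made-up members, just to show what the pattern produces at runtime:

```javascript
// Minimal reproduction of the compiled-enum pattern used by keywords.js.
// The inner assignment Enum["Name"] = value evaluates to value, so the
// outer assignment also writes the reverse entry Enum[value] = "Name".
var Color; (function (Color) {
  const Red = 0; Color[Color["Red"] = Red] = "Red";
  const Green = Red + 1; Color[Color["Green"] = Green] = "Green";
})(Color || (Color = {}));

console.log(Color.Green); // 1
console.log(Color[1]);    // "Green"
```

This is why each `ContextualKeyword` line looks doubled: one statement installs both directions of the lookup.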
							
								
								
									
										64
									
								
								node_modules/sucrase/dist/esm/parser/tokenizer/readWord.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,64 @@ | |||
| import {input, state} from "../traverser/base"; | ||||
| import {charCodes} from "../util/charcodes"; | ||||
| import {IS_IDENTIFIER_CHAR} from "../util/identifier"; | ||||
| import {finishToken} from "./index"; | ||||
| import {READ_WORD_TREE} from "./readWordTree"; | ||||
| import {TokenType as tt} from "./types"; | ||||
| 
 | ||||
| /** | ||||
|  * Read an identifier, producing either a name token or matching on one of the existing keywords. | ||||
|  * For performance, we pre-generate a big decision tree that we traverse. Each node represents a | ||||
|  * prefix and has 27 values, where the first value is the token or contextual token, if any (-1 if | ||||
|  * not), and the other 26 values are the transitions to other nodes, or -1 to stop. | ||||
|  */ | ||||
| export default function readWord() { | ||||
|   let treePos = 0; | ||||
|   let code = 0; | ||||
|   let pos = state.pos; | ||||
|   while (pos < input.length) { | ||||
|     code = input.charCodeAt(pos); | ||||
|     if (code < charCodes.lowercaseA || code > charCodes.lowercaseZ) { | ||||
|       break; | ||||
|     } | ||||
|     const next = READ_WORD_TREE[treePos + (code - charCodes.lowercaseA) + 1]; | ||||
|     if (next === -1) { | ||||
|       break; | ||||
|     } else { | ||||
|       treePos = next; | ||||
|       pos++; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   const keywordValue = READ_WORD_TREE[treePos]; | ||||
|   if (keywordValue > -1 && !IS_IDENTIFIER_CHAR[code]) { | ||||
|     state.pos = pos; | ||||
|     if (keywordValue & 1) { | ||||
|       finishToken(keywordValue >>> 1); | ||||
|     } else { | ||||
|       finishToken(tt.name, keywordValue >>> 1); | ||||
|     } | ||||
|     return; | ||||
|   } | ||||
| 
 | ||||
|   while (pos < input.length) { | ||||
|     const ch = input.charCodeAt(pos); | ||||
|     if (IS_IDENTIFIER_CHAR[ch]) { | ||||
|       pos++; | ||||
|     } else if (ch === charCodes.backslash) { | ||||
|       // \u
 | ||||
|       pos += 2; | ||||
|       if (input.charCodeAt(pos) === charCodes.leftCurlyBrace) { | ||||
|         while (pos < input.length && input.charCodeAt(pos) !== charCodes.rightCurlyBrace) { | ||||
|           pos++; | ||||
|         } | ||||
|         pos++; | ||||
|       } | ||||
|     } else if (ch === charCodes.atSign && input.charCodeAt(pos + 1) === charCodes.atSign) { | ||||
|       pos += 2; | ||||
|     } else { | ||||
|       break; | ||||
|     } | ||||
|   } | ||||
|   state.pos = pos; | ||||
|   finishToken(tt.name); | ||||
| } | ||||
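The flat-array layout that `readWord` traverses can be reproduced in miniature. A sketch under the same assumptions as the doc comment above — each node is 27 `Int32` slots, slot 0 holding the keyword value for the prefix ending at that node (`-1` if none) and slots 1..26 holding the transition offsets for letters `a`..`z` (`-1` to stop). The `buildTree`/`matchKeyword` helpers and the keyword ids are made up for illustration; the real `READ_WORD_TREE` is pre-generated:

```javascript
// Build a READ_WORD_TREE-style flat decision tree for a small keyword set.
function buildTree(keywords) {
  const nodes = [new Array(27).fill(-1)]; // node 0 is the root ("")
  for (const [word, id] of Object.entries(keywords)) {
    let node = 0;
    for (const ch of word) {
      const slot = ch.charCodeAt(0) - 97 + 1; // 'a' -> slot 1, ... 'z' -> slot 26
      if (nodes[node][slot] === -1) {
        nodes[node][slot] = nodes.length * 27; // offset of the new node
        nodes.push(new Array(27).fill(-1));
      }
      node = nodes[node][slot] / 27;
    }
    nodes[node][0] = id; // slot 0: keyword id for this exact prefix
  }
  return Int32Array.from(nodes.flat());
}

// Walk the tree the same way readWord does: one array index per letter.
function matchKeyword(tree, word) {
  let pos = 0;
  for (const ch of word) {
    const next = tree[pos + (ch.charCodeAt(0) - 97) + 1];
    if (next === -1) return -1; // fell off the tree: plain identifier
    pos = next;
  }
  return tree[pos]; // keyword id, or -1 if word is only a prefix
}

const tree = buildTree({ do: 1, if: 2, in: 3 });
console.log(matchKeyword(tree, "if")); // 2
console.log(matchKeyword(tree, "i"));  // -1 (prefix only)
console.log(matchKeyword(tree, "fn")); // -1 (not a keyword)
```

The real tree additionally packs token-vs-contextual-keyword information into the low bit of slot 0 (hence the `keywordValue & 1` and `>>> 1` in `readWord`), which this sketch omits.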
							
								
								
									
										671
									
								
								node_modules/sucrase/dist/esm/parser/tokenizer/readWordTree.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,671 @@ | |||
| // Generated file, do not edit! Run "yarn generate" to re-generate this file.
 | ||||
| import {ContextualKeyword} from "./keywords"; | ||||
| import {TokenType as tt} from "./types"; | ||||
| 
 | ||||
| // prettier-ignore
 | ||||
| export const READ_WORD_TREE = new Int32Array([ | ||||
|   // ""
 | ||||
|   -1, 27, 783, 918, 1755, 2376, 2862, 3483, -1, 3699, -1, 4617, 4752, 4833, 5130, 5508, 5940, -1, 6480, 6939, 7749, 8181, 8451, 8613, -1, 8829, -1, | ||||
|   // "a"
 | ||||
|   -1, -1, 54, 243, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 432, -1, -1, -1, 675, -1, -1, -1, | ||||
|   // "ab"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 81, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abs"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 108, -1, -1, -1, -1, -1, -1, | ||||
|   // "abst"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 135, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstr"
 | ||||
|   -1, 162, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstra"
 | ||||
|   -1, -1, -1, 189, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstrac"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 216, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstract"
 | ||||
|   ContextualKeyword._abstract << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ac"
 | ||||
|   -1, -1, -1, 270, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acc"
 | ||||
|   -1, -1, -1, -1, -1, 297, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acce"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 324, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acces"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 351, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "access"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 378, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "accesso"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 405, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "accessor"
 | ||||
|   ContextualKeyword._accessor << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "as"
 | ||||
|   ContextualKeyword._as << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 459, -1, -1, -1, -1, -1, 594, -1, | ||||
|   // "ass"
 | ||||
|   -1, -1, -1, -1, -1, 486, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asse"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 513, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asser"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 540, -1, -1, -1, -1, -1, -1, | ||||
|   // "assert"
 | ||||
|   ContextualKeyword._assert << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 567, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asserts"
 | ||||
|   ContextualKeyword._asserts << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asy"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 621, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asyn"
 | ||||
|   -1, -1, -1, 648, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "async"
 | ||||
|   ContextualKeyword._async << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "aw"
 | ||||
|   -1, 702, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "awa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 729, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "awai"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 756, -1, -1, -1, -1, -1, -1, | ||||
|   // "await"
 | ||||
|   ContextualKeyword._await << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "b"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 810, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "br"
 | ||||
|   -1, -1, -1, -1, -1, 837, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "bre"
 | ||||
|   -1, 864, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "brea"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 891, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "break"
 | ||||
|   (tt._break << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "c"
 | ||||
|   -1, 945, -1, -1, -1, -1, -1, -1, 1107, -1, -1, -1, 1242, -1, -1, 1350, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ca"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 972, 1026, -1, -1, -1, -1, -1, -1, | ||||
|   // "cas"
 | ||||
|   -1, -1, -1, -1, -1, 999, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "case"
 | ||||
|   (tt._case << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cat"
 | ||||
|   -1, -1, -1, 1053, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "catc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 1080, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "catch"
 | ||||
|   (tt._catch << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ch"
 | ||||
|   -1, -1, -1, -1, -1, 1134, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "che"
 | ||||
|   -1, -1, -1, 1161, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "chec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1188, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "check"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1215, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "checks"
 | ||||
|   ContextualKeyword._checks << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cl"
 | ||||
|   -1, 1269, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cla"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1296, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "clas"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1323, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "class"
 | ||||
|   (tt._class << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "co"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1377, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "con"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1404, 1620, -1, -1, -1, -1, -1, -1, | ||||
|   // "cons"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1431, -1, -1, -1, -1, -1, -1, | ||||
|   // "const"
 | ||||
|   (tt._const << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1458, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1485, -1, -1, -1, -1, -1, | ||||
|   // "constru"
 | ||||
|   -1, -1, -1, 1512, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "construc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1539, -1, -1, -1, -1, -1, -1, | ||||
|   // "construct"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1566, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constructo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1593, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constructor"
 | ||||
|   ContextualKeyword._constructor << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cont"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 1647, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "conti"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1674, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "contin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1701, -1, -1, -1, -1, -1, | ||||
|   // "continu"
 | ||||
|   -1, -1, -1, -1, -1, 1728, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "continue"
 | ||||
|   (tt._continue << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "d"
 | ||||
|   -1, -1, -1, -1, -1, 1782, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2349, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "de"
 | ||||
|   -1, -1, 1809, 1971, -1, -1, 2106, -1, -1, -1, -1, -1, 2241, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "deb"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1836, -1, -1, -1, -1, -1, | ||||
|   // "debu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 1863, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debug"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 1890, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugg"
 | ||||
|   -1, -1, -1, -1, -1, 1917, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugge"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1944, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugger"
 | ||||
|   (tt._debugger << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "dec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1998, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "decl"
 | ||||
|   -1, 2025, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "decla"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2052, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "declar"
 | ||||
|   -1, -1, -1, -1, -1, 2079, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "declare"
 | ||||
|   ContextualKeyword._declare << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "def"
 | ||||
|   -1, 2133, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "defa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2160, -1, -1, -1, -1, -1, | ||||
|   // "defau"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2187, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "defaul"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2214, -1, -1, -1, -1, -1, -1, | ||||
|   // "default"
 | ||||
|   (tt._default << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "del"
 | ||||
|   -1, -1, -1, -1, -1, 2268, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "dele"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2295, -1, -1, -1, -1, -1, -1, | ||||
|   // "delet"
 | ||||
|   -1, -1, -1, -1, -1, 2322, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "delete"
 | ||||
|   (tt._delete << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "do"
 | ||||
|   (tt._do << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "e"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2403, -1, 2484, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2565, -1, -1, | ||||
|   // "el"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2430, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "els"
 | ||||
|   -1, -1, -1, -1, -1, 2457, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "else"
 | ||||
|   (tt._else << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "en"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2511, -1, -1, -1, -1, -1, | ||||
|   // "enu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2538, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "enum"
 | ||||
|   ContextualKeyword._enum << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ex"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2592, -1, -1, -1, 2727, -1, -1, -1, -1, -1, -1, | ||||
|   // "exp"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2619, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "expo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2646, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "expor"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2673, -1, -1, -1, -1, -1, -1, | ||||
|   // "export"
 | ||||
|   (tt._export << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2700, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exports"
 | ||||
|   ContextualKeyword._exports << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ext"
 | ||||
|   -1, -1, -1, -1, -1, 2754, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exte"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2781, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exten"
 | ||||
|   -1, -1, -1, -1, 2808, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "extend"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2835, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "extends"
 | ||||
|   (tt._extends << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "f"
 | ||||
|   -1, 2889, -1, -1, -1, -1, -1, -1, -1, 2997, -1, -1, -1, -1, -1, 3159, -1, -1, 3213, -1, -1, 3294, -1, -1, -1, -1, -1, | ||||
|   // "fa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2916, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fal"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2943, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fals"
 | ||||
|   -1, -1, -1, -1, -1, 2970, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "false"
 | ||||
|   (tt._false << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3024, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fin"
 | ||||
|   -1, 3051, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fina"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3078, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "final"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3105, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "finall"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3132, -1, | ||||
|   // "finally"
 | ||||
|   (tt._finally << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3186, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "for"
 | ||||
|   (tt._for << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3240, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3267, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "from"
 | ||||
|   ContextualKeyword._from << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3321, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fun"
 | ||||
|   -1, -1, -1, 3348, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "func"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3375, -1, -1, -1, -1, -1, -1, | ||||
|   // "funct"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 3402, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "functi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3429, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "functio"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3456, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "function"
 | ||||
|   (tt._function << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "g"
 | ||||
|   -1, -1, -1, -1, -1, 3510, -1, -1, -1, -1, -1, -1, 3564, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ge"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3537, -1, -1, -1, -1, -1, -1, | ||||
|   // "get"
 | ||||
|   ContextualKeyword._get << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "gl"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3591, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "glo"
 | ||||
|   -1, -1, 3618, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "glob"
 | ||||
|   -1, 3645, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "globa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3672, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "global"
 | ||||
|   ContextualKeyword._global << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "i"
 | ||||
|   -1, -1, -1, -1, -1, -1, 3726, -1, -1, -1, -1, -1, -1, 3753, 4077, -1, -1, -1, -1, 4590, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "if"
 | ||||
|   (tt._if << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "im"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3780, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "imp"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3807, -1, -1, 3996, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impl"
 | ||||
|   -1, -1, -1, -1, -1, 3834, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "imple"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3861, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implem"
 | ||||
|   -1, -1, -1, -1, -1, 3888, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impleme"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3915, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implemen"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3942, -1, -1, -1, -1, -1, -1, | ||||
|   // "implement"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3969, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implements"
 | ||||
|   ContextualKeyword._implements << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4023, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impor"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4050, -1, -1, -1, -1, -1, -1, | ||||
|   // "import"
 | ||||
|   (tt._import << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "in"
 | ||||
|   (tt._in << 1) + 1, -1, -1, -1, -1, -1, 4104, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4185, 4401, -1, -1, -1, -1, -1, -1, | ||||
|   // "inf"
 | ||||
|   -1, -1, -1, -1, -1, 4131, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "infe"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4158, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "infer"
 | ||||
|   ContextualKeyword._infer << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ins"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4212, -1, -1, -1, -1, -1, -1, | ||||
|   // "inst"
 | ||||
|   -1, 4239, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "insta"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4266, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instan"
 | ||||
|   -1, -1, -1, 4293, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanc"
 | ||||
|   -1, -1, -1, -1, -1, 4320, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instance"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4347, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanceo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4374, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanceof"
 | ||||
|   (tt._instanceof << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "int"
 | ||||
|   -1, -1, -1, -1, -1, 4428, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "inte"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4455, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "inter"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4482, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interf"
 | ||||
|   -1, 4509, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interfa"
 | ||||
|   -1, -1, -1, 4536, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interfac"
 | ||||
|   -1, -1, -1, -1, -1, 4563, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interface"
 | ||||
|   ContextualKeyword._interface << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "is"
 | ||||
|   ContextualKeyword._is << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "k"
 | ||||
|   -1, -1, -1, -1, -1, 4644, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ke"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4671, -1, | ||||
|   // "key"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4698, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "keyo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4725, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "keyof"
 | ||||
|   ContextualKeyword._keyof << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "l"
 | ||||
|   -1, -1, -1, -1, -1, 4779, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "le"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4806, -1, -1, -1, -1, -1, -1, | ||||
|   // "let"
 | ||||
|   (tt._let << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "m"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 4860, -1, -1, -1, -1, -1, 4995, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4887, -1, -1, | ||||
|   // "mix"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 4914, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4941, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4968, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixins"
 | ||||
|   ContextualKeyword._mixins << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mo"
 | ||||
|   -1, -1, -1, -1, 5022, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mod"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5049, -1, -1, -1, -1, -1, | ||||
|   // "modu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5076, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "modul"
 | ||||
|   -1, -1, -1, -1, -1, 5103, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "module"
 | ||||
|   ContextualKeyword._module << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "n"
 | ||||
|   -1, 5157, -1, -1, -1, 5373, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5427, -1, -1, -1, -1, -1, | ||||
|   // "na"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5184, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nam"
 | ||||
|   -1, -1, -1, -1, -1, 5211, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "name"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5238, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "names"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5265, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namesp"
 | ||||
|   -1, 5292, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespa"
 | ||||
|   -1, -1, -1, 5319, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespac"
 | ||||
|   -1, -1, -1, -1, -1, 5346, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespace"
 | ||||
|   ContextualKeyword._namespace << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ne"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5400, -1, -1, -1, | ||||
|   // "new"
 | ||||
|   (tt._new << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5454, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nul"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5481, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "null"
 | ||||
|   (tt._null << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "o"
 | ||||
|   -1, -1, -1, -1, -1, -1, 5535, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5562, -1, -1, -1, -1, 5697, 5751, -1, -1, -1, -1, | ||||
|   // "of"
 | ||||
|   ContextualKeyword._of << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "op"
 | ||||
|   -1, 5589, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5616, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opaq"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5643, -1, -1, -1, -1, -1, | ||||
|   // "opaqu"
 | ||||
|   -1, -1, -1, -1, -1, 5670, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opaque"
 | ||||
|   ContextualKeyword._opaque << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ou"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5724, -1, -1, -1, -1, -1, -1, | ||||
|   // "out"
 | ||||
|   ContextualKeyword._out << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ov"
 | ||||
|   -1, -1, -1, -1, -1, 5778, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ove"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5805, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "over"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5832, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 5859, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overri"
 | ||||
|   -1, -1, -1, -1, 5886, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overrid"
 | ||||
|   -1, -1, -1, -1, -1, 5913, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "override"
 | ||||
|   ContextualKeyword._override << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "p"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5967, -1, -1, 6345, -1, -1, -1, -1, -1, | ||||
|   // "pr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 5994, -1, -1, -1, -1, -1, 6129, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pri"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6021, -1, -1, -1, -1, | ||||
|   // "priv"
 | ||||
|   -1, 6048, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "priva"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6075, -1, -1, -1, -1, -1, -1, | ||||
|   // "privat"
 | ||||
|   -1, -1, -1, -1, -1, 6102, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "private"
 | ||||
|   ContextualKeyword._private << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6156, -1, -1, -1, -1, -1, -1, | ||||
|   // "prot"
 | ||||
|   -1, -1, -1, -1, -1, 6183, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6318, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "prote"
 | ||||
|   -1, -1, -1, 6210, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6237, -1, -1, -1, -1, -1, -1, | ||||
|   // "protect"
 | ||||
|   -1, -1, -1, -1, -1, 6264, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protecte"
 | ||||
|   -1, -1, -1, -1, 6291, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protected"
 | ||||
|   ContextualKeyword._protected << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "proto"
 | ||||
|   ContextualKeyword._proto << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pu"
 | ||||
|   -1, -1, 6372, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pub"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6399, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "publ"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 6426, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "publi"
 | ||||
|   -1, -1, -1, 6453, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "public"
 | ||||
|   ContextualKeyword._public << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "r"
 | ||||
|   -1, -1, -1, -1, -1, 6507, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "re"
 | ||||
|   -1, 6534, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6696, -1, -1, 6831, -1, -1, -1, -1, -1, -1, | ||||
|   // "rea"
 | ||||
|   -1, -1, -1, -1, 6561, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "read"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6588, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "reado"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6615, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "readon"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6642, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "readonl"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6669, -1, | ||||
|   // "readonly"
 | ||||
|   ContextualKeyword._readonly << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "req"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6723, -1, -1, -1, -1, -1, | ||||
|   // "requ"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 6750, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "requi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6777, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "requir"
 | ||||
|   -1, -1, -1, -1, -1, 6804, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "require"
 | ||||
|   ContextualKeyword._require << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ret"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6858, -1, -1, -1, -1, -1, | ||||
|   // "retu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6885, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "retur"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6912, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "return"
 | ||||
|   (tt._return << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "s"
 | ||||
|   -1, 6966, -1, -1, -1, 7182, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7236, 7371, -1, 7479, -1, 7614, -1, | ||||
|   // "sa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6993, -1, -1, -1, -1, -1, -1, | ||||
|   // "sat"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7020, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sati"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7047, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satis"
 | ||||
|   -1, -1, -1, -1, -1, -1, 7074, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisf"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7101, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfi"
 | ||||
|   -1, -1, -1, -1, -1, 7128, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfie"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7155, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfies"
 | ||||
|   ContextualKeyword._satisfies << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "se"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7209, -1, -1, -1, -1, -1, -1, | ||||
|   // "set"
 | ||||
|   ContextualKeyword._set << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "st"
 | ||||
|   -1, 7263, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sta"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7290, -1, -1, -1, -1, -1, -1, | ||||
|   // "stat"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7317, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "stati"
 | ||||
|   -1, -1, -1, 7344, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "static"
 | ||||
|   ContextualKeyword._static << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "su"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7398, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sup"
 | ||||
|   -1, -1, -1, -1, -1, 7425, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "supe"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7452, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "super"
 | ||||
|   (tt._super << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sw"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7506, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "swi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7533, -1, -1, -1, -1, -1, -1, | ||||
|   // "swit"
 | ||||
|   -1, -1, -1, 7560, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "switc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 7587, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "switch"
 | ||||
|   (tt._switch << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sy"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7641, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sym"
 | ||||
|   -1, -1, 7668, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symb"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7695, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symbo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7722, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symbol"
 | ||||
|   ContextualKeyword._symbol << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "t"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 7776, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7938, -1, -1, -1, -1, -1, -1, 8046, -1, | ||||
|   // "th"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7803, -1, -1, -1, -1, -1, -1, -1, -1, 7857, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7830, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "this"
 | ||||
|   (tt._this << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7884, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7911, -1, -1, -1, | ||||
|   // "throw"
 | ||||
|   (tt._throw << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "tr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7965, -1, -1, -1, 8019, -1, | ||||
|   // "tru"
 | ||||
|   -1, -1, -1, -1, -1, 7992, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "true"
 | ||||
|   (tt._true << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "try"
 | ||||
|   (tt._try << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ty"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8073, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typ"
 | ||||
|   -1, -1, -1, -1, -1, 8100, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "type"
 | ||||
|   ContextualKeyword._type << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8127, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typeo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 8154, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typeof"
 | ||||
|   (tt._typeof << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "u"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8208, -1, -1, -1, -1, 8343, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "un"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8235, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "uni"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8262, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "uniq"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8289, -1, -1, -1, -1, -1, | ||||
|   // "uniqu"
 | ||||
|   -1, -1, -1, -1, -1, 8316, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "unique"
 | ||||
|   ContextualKeyword._unique << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "us"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8370, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "usi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8397, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "usin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 8424, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "using"
 | ||||
|   ContextualKeyword._using << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "v"
 | ||||
|   -1, 8478, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8532, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "va"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8505, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "var"
 | ||||
|   (tt._var << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "vo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8559, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "voi"
 | ||||
|   -1, -1, -1, -1, 8586, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "void"
 | ||||
|   (tt._void << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "w"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 8640, 8748, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "wh"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8667, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "whi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8694, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "whil"
 | ||||
|   -1, -1, -1, -1, -1, 8721, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "while"
 | ||||
|   (tt._while << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "wi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8775, -1, -1, -1, -1, -1, -1, | ||||
|   // "wit"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 8802, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "with"
 | ||||
|   (tt._with << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "y"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8856, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yi"
 | ||||
|   -1, -1, -1, -1, -1, 8883, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yie"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8910, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yiel"
 | ||||
|   -1, -1, -1, -1, 8937, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yield"
 | ||||
|   (tt._yield << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
| ]); | ||||
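The table above is a flattened trie: each node occupies 27 consecutive slots, where slot 0 holds the result for the word ending at that node and slots 1-26 hold child offsets for the letters `a`-`z`. A result of `-1` means "not a keyword"; an odd result encodes a token type as `(token << 1) + 1`; an even result encodes a contextual keyword as `keyword << 1`. A minimal standalone sketch of that lookup (not sucrase's actual `readWordTree`; the token ids here are made up for the demo):

```javascript
// Each trie node is 27 int32 slots: slot 0 = result, slots 1-26 = children
// for 'a'..'z'. Child values are absolute indices into the flat array.
const NODE = 27;

// Hypothetical token ids for the demo (sucrase uses its own TokenType enum).
const TT_IF = 10;
const TT_IN = 11;

// Build a tiny trie containing just "if" and "in".
// Node 0: root; node 1: after "i"; node 2: after "if"; node 3: after "in".
const tree = new Int32Array(4 * NODE).fill(-1);
tree[0 * NODE + 1 + ("i".charCodeAt(0) - 97)] = 1 * NODE; // root --i--> node 1
tree[1 * NODE + 1 + ("f".charCodeAt(0) - 97)] = 2 * NODE; // "i" --f--> node 2
tree[1 * NODE + 1 + ("n".charCodeAt(0) - 97)] = 3 * NODE; // "i" --n--> node 3
tree[2 * NODE] = (TT_IF << 1) + 1; // "if" is a keyword token (odd result)
tree[3 * NODE] = (TT_IN << 1) + 1; // "in" is a keyword token (odd result)

function lookupWord(word) {
  let pos = 0;
  for (const ch of word) {
    const code = ch.charCodeAt(0);
    if (code < 97 || code > 122) return null; // only a-z is in the trie
    const next = tree[pos + 1 + (code - 97)];
    if (next === -1) return null; // no such child: not a keyword
    pos = next;
  }
  const result = tree[pos];
  if (result === -1) return null; // valid prefix, but no word ends here
  return result & 1
    ? { kind: "token", value: result >>> 1 }
    : { kind: "contextual", value: result >>> 1 };
}
```

With this encoding, walking one letter is a single array read, which is why the generated table above stores child pointers as precomputed absolute indices (all multiples of 27).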
							
								
								
									
										106
									
								
								node_modules/sucrase/dist/esm/parser/tokenizer/state.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,106 @@ | |||
| 
 | ||||
| import {ContextualKeyword} from "./keywords"; | ||||
| import { TokenType as tt} from "./types"; | ||||
| 
 | ||||
| export class Scope { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor(startTokenIndex, endTokenIndex, isFunctionScope) { | ||||
|     this.startTokenIndex = startTokenIndex; | ||||
|     this.endTokenIndex = endTokenIndex; | ||||
|     this.isFunctionScope = isFunctionScope; | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| export class StateSnapshot { | ||||
|   constructor( | ||||
|      potentialArrowAt, | ||||
|      noAnonFunctionType, | ||||
|      inDisallowConditionalTypesContext, | ||||
|      tokensLength, | ||||
|      scopesLength, | ||||
|      pos, | ||||
|      type, | ||||
|      contextualKeyword, | ||||
|      start, | ||||
|      end, | ||||
|      isType, | ||||
|      scopeDepth, | ||||
|      error, | ||||
|   ) {;this.potentialArrowAt = potentialArrowAt;this.noAnonFunctionType = noAnonFunctionType;this.inDisallowConditionalTypesContext = inDisallowConditionalTypesContext;this.tokensLength = tokensLength;this.scopesLength = scopesLength;this.pos = pos;this.type = type;this.contextualKeyword = contextualKeyword;this.start = start;this.end = end;this.isType = isType;this.scopeDepth = scopeDepth;this.error = error;} | ||||
| } | ||||
| 
 | ||||
| export default class State {constructor() { State.prototype.__init.call(this);State.prototype.__init2.call(this);State.prototype.__init3.call(this);State.prototype.__init4.call(this);State.prototype.__init5.call(this);State.prototype.__init6.call(this);State.prototype.__init7.call(this);State.prototype.__init8.call(this);State.prototype.__init9.call(this);State.prototype.__init10.call(this);State.prototype.__init11.call(this);State.prototype.__init12.call(this);State.prototype.__init13.call(this); } | ||||
|   // Used to signify the start of a potential arrow function
 | ||||
|   __init() {this.potentialArrowAt = -1} | ||||
| 
 | ||||
|   // Used by Flow to handle an edge case involving function type parsing.
 | ||||
|   __init2() {this.noAnonFunctionType = false} | ||||
| 
 | ||||
|   // Used by TypeScript to handle ambiguities when parsing conditional types.
 | ||||
|   __init3() {this.inDisallowConditionalTypesContext = false} | ||||
| 
 | ||||
|   // Token store.
 | ||||
|   __init4() {this.tokens = []} | ||||
| 
 | ||||
|   // Array of all observed scopes, ordered by their ending position.
 | ||||
|   __init5() {this.scopes = []} | ||||
| 
 | ||||
|   // The current position of the tokenizer in the input.
 | ||||
|   __init6() {this.pos = 0} | ||||
| 
 | ||||
|   // Information about the current token.
 | ||||
|   __init7() {this.type = tt.eof} | ||||
|   __init8() {this.contextualKeyword = ContextualKeyword.NONE} | ||||
|   __init9() {this.start = 0} | ||||
|   __init10() {this.end = 0} | ||||
| 
 | ||||
|   __init11() {this.isType = false} | ||||
|   __init12() {this.scopeDepth = 0} | ||||
| 
 | ||||
|   /** | ||||
|    * If the parser is in an error state, then the token is always tt.eof and all functions can | ||||
|    * keep executing but should be written so they don't get into an infinite loop in this situation. | ||||
|    * | ||||
|    * This approach, combined with the ability to snapshot and restore state, allows us to implement | ||||
|    * backtracking without exceptions and without needing to explicitly propagate error states | ||||
|    * everywhere. | ||||
|    */ | ||||
|   __init13() {this.error = null} | ||||
| 
 | ||||
|   snapshot() { | ||||
|     return new StateSnapshot( | ||||
|       this.potentialArrowAt, | ||||
|       this.noAnonFunctionType, | ||||
|       this.inDisallowConditionalTypesContext, | ||||
|       this.tokens.length, | ||||
|       this.scopes.length, | ||||
|       this.pos, | ||||
|       this.type, | ||||
|       this.contextualKeyword, | ||||
|       this.start, | ||||
|       this.end, | ||||
|       this.isType, | ||||
|       this.scopeDepth, | ||||
|       this.error, | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   restoreFromSnapshot(snapshot) { | ||||
|     this.potentialArrowAt = snapshot.potentialArrowAt; | ||||
|     this.noAnonFunctionType = snapshot.noAnonFunctionType; | ||||
|     this.inDisallowConditionalTypesContext = snapshot.inDisallowConditionalTypesContext; | ||||
|     this.tokens.length = snapshot.tokensLength; | ||||
|     this.scopes.length = snapshot.scopesLength; | ||||
|     this.pos = snapshot.pos; | ||||
|     this.type = snapshot.type; | ||||
|     this.contextualKeyword = snapshot.contextualKeyword; | ||||
|     this.start = snapshot.start; | ||||
|     this.end = snapshot.end; | ||||
|     this.isType = snapshot.isType; | ||||
|     this.scopeDepth = snapshot.scopeDepth; | ||||
|     this.error = snapshot.error; | ||||
|   } | ||||
| } | ||||
							
								
								
									
361  node_modules/sucrase/dist/esm/parser/tokenizer/types.js  generated  vendored  Normal file
|  | @ -0,0 +1,361 @@ | |||
| // Generated file, do not edit! Run "yarn generate" to re-generate this file.
 | ||||
| /* istanbul ignore file */ | ||||
| /** | ||||
|  * Enum of all token types, with bit fields to signify meaningful properties. | ||||
|  */ | ||||
| export var TokenType; (function (TokenType) { | ||||
|   // Precedence 0 means not an operator; otherwise it is a positive number up to 12.
 | ||||
|   const PRECEDENCE_MASK = 0xf; TokenType[TokenType["PRECEDENCE_MASK"] = PRECEDENCE_MASK] = "PRECEDENCE_MASK"; | ||||
|   const IS_KEYWORD = 1 << 4; TokenType[TokenType["IS_KEYWORD"] = IS_KEYWORD] = "IS_KEYWORD"; | ||||
|   const IS_ASSIGN = 1 << 5; TokenType[TokenType["IS_ASSIGN"] = IS_ASSIGN] = "IS_ASSIGN"; | ||||
|   const IS_RIGHT_ASSOCIATIVE = 1 << 6; TokenType[TokenType["IS_RIGHT_ASSOCIATIVE"] = IS_RIGHT_ASSOCIATIVE] = "IS_RIGHT_ASSOCIATIVE"; | ||||
|   const IS_PREFIX = 1 << 7; TokenType[TokenType["IS_PREFIX"] = IS_PREFIX] = "IS_PREFIX"; | ||||
|   const IS_POSTFIX = 1 << 8; TokenType[TokenType["IS_POSTFIX"] = IS_POSTFIX] = "IS_POSTFIX"; | ||||
|   const IS_EXPRESSION_START = 1 << 9; TokenType[TokenType["IS_EXPRESSION_START"] = IS_EXPRESSION_START] = "IS_EXPRESSION_START"; | ||||
| 
 | ||||
|   const num = 512; TokenType[TokenType["num"] = num] = "num"; // num startsExpr
 | ||||
|   const bigint = 1536; TokenType[TokenType["bigint"] = bigint] = "bigint"; // bigint startsExpr
 | ||||
|   const decimal = 2560; TokenType[TokenType["decimal"] = decimal] = "decimal"; // decimal startsExpr
 | ||||
|   const regexp = 3584; TokenType[TokenType["regexp"] = regexp] = "regexp"; // regexp startsExpr
 | ||||
|   const string = 4608; TokenType[TokenType["string"] = string] = "string"; // string startsExpr
 | ||||
|   const name = 5632; TokenType[TokenType["name"] = name] = "name"; // name startsExpr
 | ||||
|   const eof = 6144; TokenType[TokenType["eof"] = eof] = "eof"; // eof
 | ||||
|   const bracketL = 7680; TokenType[TokenType["bracketL"] = bracketL] = "bracketL"; // [ startsExpr
 | ||||
|   const bracketR = 8192; TokenType[TokenType["bracketR"] = bracketR] = "bracketR"; // ]
 | ||||
|   const braceL = 9728; TokenType[TokenType["braceL"] = braceL] = "braceL"; // { startsExpr
 | ||||
|   const braceBarL = 10752; TokenType[TokenType["braceBarL"] = braceBarL] = "braceBarL"; // {| startsExpr
 | ||||
|   const braceR = 11264; TokenType[TokenType["braceR"] = braceR] = "braceR"; // }
 | ||||
|   const braceBarR = 12288; TokenType[TokenType["braceBarR"] = braceBarR] = "braceBarR"; // |}
 | ||||
|   const parenL = 13824; TokenType[TokenType["parenL"] = parenL] = "parenL"; // ( startsExpr
 | ||||
|   const parenR = 14336; TokenType[TokenType["parenR"] = parenR] = "parenR"; // )
 | ||||
|   const comma = 15360; TokenType[TokenType["comma"] = comma] = "comma"; // ,
 | ||||
|   const semi = 16384; TokenType[TokenType["semi"] = semi] = "semi"; // ;
 | ||||
|   const colon = 17408; TokenType[TokenType["colon"] = colon] = "colon"; // :
 | ||||
|   const doubleColon = 18432; TokenType[TokenType["doubleColon"] = doubleColon] = "doubleColon"; // ::
 | ||||
|   const dot = 19456; TokenType[TokenType["dot"] = dot] = "dot"; // .
 | ||||
|   const question = 20480; TokenType[TokenType["question"] = question] = "question"; // ?
 | ||||
|   const questionDot = 21504; TokenType[TokenType["questionDot"] = questionDot] = "questionDot"; // ?.
 | ||||
|   const arrow = 22528; TokenType[TokenType["arrow"] = arrow] = "arrow"; // =>
 | ||||
|   const template = 23552; TokenType[TokenType["template"] = template] = "template"; // template
 | ||||
|   const ellipsis = 24576; TokenType[TokenType["ellipsis"] = ellipsis] = "ellipsis"; // ...
 | ||||
|   const backQuote = 25600; TokenType[TokenType["backQuote"] = backQuote] = "backQuote"; // `
 | ||||
|   const dollarBraceL = 27136; TokenType[TokenType["dollarBraceL"] = dollarBraceL] = "dollarBraceL"; // ${ startsExpr
 | ||||
|   const at = 27648; TokenType[TokenType["at"] = at] = "at"; // @
 | ||||
|   const hash = 29184; TokenType[TokenType["hash"] = hash] = "hash"; // # startsExpr
 | ||||
|   const eq = 29728; TokenType[TokenType["eq"] = eq] = "eq"; // = isAssign
 | ||||
|   const assign = 30752; TokenType[TokenType["assign"] = assign] = "assign"; // _= isAssign
 | ||||
|   const preIncDec = 32640; TokenType[TokenType["preIncDec"] = preIncDec] = "preIncDec"; // ++/-- prefix postfix startsExpr
 | ||||
|   const postIncDec = 33664; TokenType[TokenType["postIncDec"] = postIncDec] = "postIncDec"; // ++/-- prefix postfix startsExpr
 | ||||
|   const bang = 34432; TokenType[TokenType["bang"] = bang] = "bang"; // ! prefix startsExpr
 | ||||
|   const tilde = 35456; TokenType[TokenType["tilde"] = tilde] = "tilde"; // ~ prefix startsExpr
 | ||||
|   const pipeline = 35841; TokenType[TokenType["pipeline"] = pipeline] = "pipeline"; // |> prec:1
 | ||||
|   const nullishCoalescing = 36866; TokenType[TokenType["nullishCoalescing"] = nullishCoalescing] = "nullishCoalescing"; // ?? prec:2
 | ||||
|   const logicalOR = 37890; TokenType[TokenType["logicalOR"] = logicalOR] = "logicalOR"; // || prec:2
 | ||||
|   const logicalAND = 38915; TokenType[TokenType["logicalAND"] = logicalAND] = "logicalAND"; // && prec:3
 | ||||
|   const bitwiseOR = 39940; TokenType[TokenType["bitwiseOR"] = bitwiseOR] = "bitwiseOR"; // | prec:4
 | ||||
|   const bitwiseXOR = 40965; TokenType[TokenType["bitwiseXOR"] = bitwiseXOR] = "bitwiseXOR"; // ^ prec:5
 | ||||
|   const bitwiseAND = 41990; TokenType[TokenType["bitwiseAND"] = bitwiseAND] = "bitwiseAND"; // & prec:6
 | ||||
|   const equality = 43015; TokenType[TokenType["equality"] = equality] = "equality"; // ==/!= prec:7
 | ||||
|   const lessThan = 44040; TokenType[TokenType["lessThan"] = lessThan] = "lessThan"; // < prec:8
 | ||||
|   const greaterThan = 45064; TokenType[TokenType["greaterThan"] = greaterThan] = "greaterThan"; // > prec:8
 | ||||
|   const relationalOrEqual = 46088; TokenType[TokenType["relationalOrEqual"] = relationalOrEqual] = "relationalOrEqual"; // <=/>= prec:8
 | ||||
|   const bitShiftL = 47113; TokenType[TokenType["bitShiftL"] = bitShiftL] = "bitShiftL"; // << prec:9
 | ||||
|   const bitShiftR = 48137; TokenType[TokenType["bitShiftR"] = bitShiftR] = "bitShiftR"; // >>/>>> prec:9
 | ||||
|   const plus = 49802; TokenType[TokenType["plus"] = plus] = "plus"; // + prec:10 prefix startsExpr
 | ||||
|   const minus = 50826; TokenType[TokenType["minus"] = minus] = "minus"; // - prec:10 prefix startsExpr
 | ||||
|   const modulo = 51723; TokenType[TokenType["modulo"] = modulo] = "modulo"; // % prec:11 startsExpr
 | ||||
|   const star = 52235; TokenType[TokenType["star"] = star] = "star"; // * prec:11
 | ||||
|   const slash = 53259; TokenType[TokenType["slash"] = slash] = "slash"; // / prec:11
 | ||||
|   const exponent = 54348; TokenType[TokenType["exponent"] = exponent] = "exponent"; // ** prec:12 rightAssociative
 | ||||
|   const jsxName = 55296; TokenType[TokenType["jsxName"] = jsxName] = "jsxName"; // jsxName
 | ||||
|   const jsxText = 56320; TokenType[TokenType["jsxText"] = jsxText] = "jsxText"; // jsxText
 | ||||
|   const jsxEmptyText = 57344; TokenType[TokenType["jsxEmptyText"] = jsxEmptyText] = "jsxEmptyText"; // jsxEmptyText
 | ||||
|   const jsxTagStart = 58880; TokenType[TokenType["jsxTagStart"] = jsxTagStart] = "jsxTagStart"; // jsxTagStart startsExpr
 | ||||
|   const jsxTagEnd = 59392; TokenType[TokenType["jsxTagEnd"] = jsxTagEnd] = "jsxTagEnd"; // jsxTagEnd
 | ||||
|   const typeParameterStart = 60928; TokenType[TokenType["typeParameterStart"] = typeParameterStart] = "typeParameterStart"; // typeParameterStart startsExpr
 | ||||
|   const nonNullAssertion = 61440; TokenType[TokenType["nonNullAssertion"] = nonNullAssertion] = "nonNullAssertion"; // nonNullAssertion
 | ||||
|   const _break = 62480; TokenType[TokenType["_break"] = _break] = "_break"; // break keyword
 | ||||
|   const _case = 63504; TokenType[TokenType["_case"] = _case] = "_case"; // case keyword
 | ||||
|   const _catch = 64528; TokenType[TokenType["_catch"] = _catch] = "_catch"; // catch keyword
 | ||||
|   const _continue = 65552; TokenType[TokenType["_continue"] = _continue] = "_continue"; // continue keyword
 | ||||
|   const _debugger = 66576; TokenType[TokenType["_debugger"] = _debugger] = "_debugger"; // debugger keyword
 | ||||
|   const _default = 67600; TokenType[TokenType["_default"] = _default] = "_default"; // default keyword
 | ||||
|   const _do = 68624; TokenType[TokenType["_do"] = _do] = "_do"; // do keyword
 | ||||
|   const _else = 69648; TokenType[TokenType["_else"] = _else] = "_else"; // else keyword
 | ||||
|   const _finally = 70672; TokenType[TokenType["_finally"] = _finally] = "_finally"; // finally keyword
 | ||||
|   const _for = 71696; TokenType[TokenType["_for"] = _for] = "_for"; // for keyword
 | ||||
|   const _function = 73232; TokenType[TokenType["_function"] = _function] = "_function"; // function keyword startsExpr
 | ||||
|   const _if = 73744; TokenType[TokenType["_if"] = _if] = "_if"; // if keyword
 | ||||
|   const _return = 74768; TokenType[TokenType["_return"] = _return] = "_return"; // return keyword
 | ||||
|   const _switch = 75792; TokenType[TokenType["_switch"] = _switch] = "_switch"; // switch keyword
 | ||||
|   const _throw = 77456; TokenType[TokenType["_throw"] = _throw] = "_throw"; // throw keyword prefix startsExpr
 | ||||
|   const _try = 77840; TokenType[TokenType["_try"] = _try] = "_try"; // try keyword
 | ||||
|   const _var = 78864; TokenType[TokenType["_var"] = _var] = "_var"; // var keyword
 | ||||
|   const _let = 79888; TokenType[TokenType["_let"] = _let] = "_let"; // let keyword
 | ||||
|   const _const = 80912; TokenType[TokenType["_const"] = _const] = "_const"; // const keyword
 | ||||
|   const _while = 81936; TokenType[TokenType["_while"] = _while] = "_while"; // while keyword
 | ||||
|   const _with = 82960; TokenType[TokenType["_with"] = _with] = "_with"; // with keyword
 | ||||
|   const _new = 84496; TokenType[TokenType["_new"] = _new] = "_new"; // new keyword startsExpr
 | ||||
|   const _this = 85520; TokenType[TokenType["_this"] = _this] = "_this"; // this keyword startsExpr
 | ||||
|   const _super = 86544; TokenType[TokenType["_super"] = _super] = "_super"; // super keyword startsExpr
 | ||||
|   const _class = 87568; TokenType[TokenType["_class"] = _class] = "_class"; // class keyword startsExpr
 | ||||
|   const _extends = 88080; TokenType[TokenType["_extends"] = _extends] = "_extends"; // extends keyword
 | ||||
|   const _export = 89104; TokenType[TokenType["_export"] = _export] = "_export"; // export keyword
 | ||||
|   const _import = 90640; TokenType[TokenType["_import"] = _import] = "_import"; // import keyword startsExpr
 | ||||
|   const _yield = 91664; TokenType[TokenType["_yield"] = _yield] = "_yield"; // yield keyword startsExpr
 | ||||
|   const _null = 92688; TokenType[TokenType["_null"] = _null] = "_null"; // null keyword startsExpr
 | ||||
|   const _true = 93712; TokenType[TokenType["_true"] = _true] = "_true"; // true keyword startsExpr
 | ||||
|   const _false = 94736; TokenType[TokenType["_false"] = _false] = "_false"; // false keyword startsExpr
 | ||||
|   const _in = 95256; TokenType[TokenType["_in"] = _in] = "_in"; // in prec:8 keyword
 | ||||
|   const _instanceof = 96280; TokenType[TokenType["_instanceof"] = _instanceof] = "_instanceof"; // instanceof prec:8 keyword
 | ||||
|   const _typeof = 97936; TokenType[TokenType["_typeof"] = _typeof] = "_typeof"; // typeof keyword prefix startsExpr
 | ||||
|   const _void = 98960; TokenType[TokenType["_void"] = _void] = "_void"; // void keyword prefix startsExpr
 | ||||
|   const _delete = 99984; TokenType[TokenType["_delete"] = _delete] = "_delete"; // delete keyword prefix startsExpr
 | ||||
|   const _async = 100880; TokenType[TokenType["_async"] = _async] = "_async"; // async keyword startsExpr
 | ||||
|   const _get = 101904; TokenType[TokenType["_get"] = _get] = "_get"; // get keyword startsExpr
 | ||||
|   const _set = 102928; TokenType[TokenType["_set"] = _set] = "_set"; // set keyword startsExpr
 | ||||
|   const _declare = 103952; TokenType[TokenType["_declare"] = _declare] = "_declare"; // declare keyword startsExpr
 | ||||
|   const _readonly = 104976; TokenType[TokenType["_readonly"] = _readonly] = "_readonly"; // readonly keyword startsExpr
 | ||||
|   const _abstract = 106000; TokenType[TokenType["_abstract"] = _abstract] = "_abstract"; // abstract keyword startsExpr
 | ||||
|   const _static = 107024; TokenType[TokenType["_static"] = _static] = "_static"; // static keyword startsExpr
 | ||||
|   const _public = 107536; TokenType[TokenType["_public"] = _public] = "_public"; // public keyword
 | ||||
|   const _private = 108560; TokenType[TokenType["_private"] = _private] = "_private"; // private keyword
 | ||||
|   const _protected = 109584; TokenType[TokenType["_protected"] = _protected] = "_protected"; // protected keyword
 | ||||
|   const _override = 110608; TokenType[TokenType["_override"] = _override] = "_override"; // override keyword
 | ||||
|   const _as = 112144; TokenType[TokenType["_as"] = _as] = "_as"; // as keyword startsExpr
 | ||||
|   const _enum = 113168; TokenType[TokenType["_enum"] = _enum] = "_enum"; // enum keyword startsExpr
 | ||||
|   const _type = 114192; TokenType[TokenType["_type"] = _type] = "_type"; // type keyword startsExpr
 | ||||
|   const _implements = 115216; TokenType[TokenType["_implements"] = _implements] = "_implements"; // implements keyword startsExpr
 | ||||
| })(TokenType || (TokenType = {})); | ||||
| export function formatTokenType(tokenType) { | ||||
|   switch (tokenType) { | ||||
|     case TokenType.num: | ||||
|       return "num"; | ||||
|     case TokenType.bigint: | ||||
|       return "bigint"; | ||||
|     case TokenType.decimal: | ||||
|       return "decimal"; | ||||
|     case TokenType.regexp: | ||||
|       return "regexp"; | ||||
|     case TokenType.string: | ||||
|       return "string"; | ||||
|     case TokenType.name: | ||||
|       return "name"; | ||||
|     case TokenType.eof: | ||||
|       return "eof"; | ||||
|     case TokenType.bracketL: | ||||
|       return "["; | ||||
|     case TokenType.bracketR: | ||||
|       return "]"; | ||||
|     case TokenType.braceL: | ||||
|       return "{"; | ||||
|     case TokenType.braceBarL: | ||||
|       return "{|"; | ||||
|     case TokenType.braceR: | ||||
|       return "}"; | ||||
|     case TokenType.braceBarR: | ||||
|       return "|}"; | ||||
|     case TokenType.parenL: | ||||
|       return "("; | ||||
|     case TokenType.parenR: | ||||
|       return ")"; | ||||
|     case TokenType.comma: | ||||
|       return ","; | ||||
|     case TokenType.semi: | ||||
|       return ";"; | ||||
|     case TokenType.colon: | ||||
|       return ":"; | ||||
|     case TokenType.doubleColon: | ||||
|       return "::"; | ||||
|     case TokenType.dot: | ||||
|       return "."; | ||||
|     case TokenType.question: | ||||
|       return "?"; | ||||
|     case TokenType.questionDot: | ||||
|       return "?."; | ||||
|     case TokenType.arrow: | ||||
|       return "=>"; | ||||
|     case TokenType.template: | ||||
|       return "template"; | ||||
|     case TokenType.ellipsis: | ||||
|       return "..."; | ||||
|     case TokenType.backQuote: | ||||
|       return "`"; | ||||
|     case TokenType.dollarBraceL: | ||||
|       return "${"; | ||||
|     case TokenType.at: | ||||
|       return "@"; | ||||
|     case TokenType.hash: | ||||
|       return "#"; | ||||
|     case TokenType.eq: | ||||
|       return "="; | ||||
|     case TokenType.assign: | ||||
|       return "_="; | ||||
|     case TokenType.preIncDec: | ||||
|       return "++/--"; | ||||
|     case TokenType.postIncDec: | ||||
|       return "++/--"; | ||||
|     case TokenType.bang: | ||||
|       return "!"; | ||||
|     case TokenType.tilde: | ||||
|       return "~"; | ||||
|     case TokenType.pipeline: | ||||
|       return "|>"; | ||||
|     case TokenType.nullishCoalescing: | ||||
|       return "??"; | ||||
|     case TokenType.logicalOR: | ||||
|       return "||"; | ||||
|     case TokenType.logicalAND: | ||||
|       return "&&"; | ||||
|     case TokenType.bitwiseOR: | ||||
|       return "|"; | ||||
|     case TokenType.bitwiseXOR: | ||||
|       return "^"; | ||||
|     case TokenType.bitwiseAND: | ||||
|       return "&"; | ||||
|     case TokenType.equality: | ||||
|       return "==/!="; | ||||
|     case TokenType.lessThan: | ||||
|       return "<"; | ||||
|     case TokenType.greaterThan: | ||||
|       return ">"; | ||||
|     case TokenType.relationalOrEqual: | ||||
|       return "<=/>="; | ||||
|     case TokenType.bitShiftL: | ||||
|       return "<<"; | ||||
|     case TokenType.bitShiftR: | ||||
|       return ">>/>>>"; | ||||
|     case TokenType.plus: | ||||
|       return "+"; | ||||
|     case TokenType.minus: | ||||
|       return "-"; | ||||
|     case TokenType.modulo: | ||||
|       return "%"; | ||||
|     case TokenType.star: | ||||
|       return "*"; | ||||
|     case TokenType.slash: | ||||
|       return "/"; | ||||
|     case TokenType.exponent: | ||||
|       return "**"; | ||||
|     case TokenType.jsxName: | ||||
|       return "jsxName"; | ||||
|     case TokenType.jsxText: | ||||
|       return "jsxText"; | ||||
|     case TokenType.jsxEmptyText: | ||||
|       return "jsxEmptyText"; | ||||
|     case TokenType.jsxTagStart: | ||||
|       return "jsxTagStart"; | ||||
|     case TokenType.jsxTagEnd: | ||||
|       return "jsxTagEnd"; | ||||
|     case TokenType.typeParameterStart: | ||||
|       return "typeParameterStart"; | ||||
|     case TokenType.nonNullAssertion: | ||||
|       return "nonNullAssertion"; | ||||
|     case TokenType._break: | ||||
|       return "break"; | ||||
|     case TokenType._case: | ||||
|       return "case"; | ||||
|     case TokenType._catch: | ||||
|       return "catch"; | ||||
|     case TokenType._continue: | ||||
|       return "continue"; | ||||
|     case TokenType._debugger: | ||||
|       return "debugger"; | ||||
|     case TokenType._default: | ||||
|       return "default"; | ||||
|     case TokenType._do: | ||||
|       return "do"; | ||||
|     case TokenType._else: | ||||
|       return "else"; | ||||
|     case TokenType._finally: | ||||
|       return "finally"; | ||||
|     case TokenType._for: | ||||
|       return "for"; | ||||
|     case TokenType._function: | ||||
|       return "function"; | ||||
|     case TokenType._if: | ||||
|       return "if"; | ||||
|     case TokenType._return: | ||||
|       return "return"; | ||||
|     case TokenType._switch: | ||||
|       return "switch"; | ||||
|     case TokenType._throw: | ||||
|       return "throw"; | ||||
|     case TokenType._try: | ||||
|       return "try"; | ||||
|     case TokenType._var: | ||||
|       return "var"; | ||||
|     case TokenType._let: | ||||
|       return "let"; | ||||
|     case TokenType._const: | ||||
|       return "const"; | ||||
|     case TokenType._while: | ||||
|       return "while"; | ||||
|     case TokenType._with: | ||||
|       return "with"; | ||||
|     case TokenType._new: | ||||
|       return "new"; | ||||
|     case TokenType._this: | ||||
|       return "this"; | ||||
|     case TokenType._super: | ||||
|       return "super"; | ||||
|     case TokenType._class: | ||||
|       return "class"; | ||||
|     case TokenType._extends: | ||||
|       return "extends"; | ||||
|     case TokenType._export: | ||||
|       return "export"; | ||||
|     case TokenType._import: | ||||
|       return "import"; | ||||
|     case TokenType._yield: | ||||
|       return "yield"; | ||||
|     case TokenType._null: | ||||
|       return "null"; | ||||
|     case TokenType._true: | ||||
|       return "true"; | ||||
|     case TokenType._false: | ||||
|       return "false"; | ||||
|     case TokenType._in: | ||||
|       return "in"; | ||||
|     case TokenType._instanceof: | ||||
|       return "instanceof"; | ||||
|     case TokenType._typeof: | ||||
|       return "typeof"; | ||||
|     case TokenType._void: | ||||
|       return "void"; | ||||
|     case TokenType._delete: | ||||
|       return "delete"; | ||||
|     case TokenType._async: | ||||
|       return "async"; | ||||
|     case TokenType._get: | ||||
|       return "get"; | ||||
|     case TokenType._set: | ||||
|       return "set"; | ||||
|     case TokenType._declare: | ||||
|       return "declare"; | ||||
|     case TokenType._readonly: | ||||
|       return "readonly"; | ||||
|     case TokenType._abstract: | ||||
|       return "abstract"; | ||||
|     case TokenType._static: | ||||
|       return "static"; | ||||
|     case TokenType._public: | ||||
|       return "public"; | ||||
|     case TokenType._private: | ||||
|       return "private"; | ||||
|     case TokenType._protected: | ||||
|       return "protected"; | ||||
|     case TokenType._override: | ||||
|       return "override"; | ||||
|     case TokenType._as: | ||||
|       return "as"; | ||||
|     case TokenType._enum: | ||||
|       return "enum"; | ||||
|     case TokenType._type: | ||||
|       return "type"; | ||||
|     case TokenType._implements: | ||||
|       return "implements"; | ||||
|     default: | ||||
|       return ""; | ||||
|   } | ||||
| } | ||||
							
								
								
									
60  node_modules/sucrase/dist/esm/parser/traverser/base.js  generated  vendored  Normal file
|  | @ -0,0 +1,60 @@ | |||
| import State from "../tokenizer/state"; | ||||
| import {charCodes} from "../util/charcodes"; | ||||
| 
 | ||||
| export let isJSXEnabled; | ||||
| export let isTypeScriptEnabled; | ||||
| export let isFlowEnabled; | ||||
| export let state; | ||||
| export let input; | ||||
| export let nextContextId; | ||||
| 
 | ||||
| export function getNextContextId() { | ||||
|   return nextContextId++; | ||||
| } | ||||
| 
 | ||||
| // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
| export function augmentError(error) { | ||||
|   if ("pos" in error) { | ||||
|     const loc = locationForIndex(error.pos); | ||||
|     error.message += ` (${loc.line}:${loc.column})`; | ||||
|     error.loc = loc; | ||||
|   } | ||||
|   return error; | ||||
| } | ||||
| 
 | ||||
| export class Loc { | ||||
|    | ||||
|    | ||||
|   constructor(line, column) { | ||||
|     this.line = line; | ||||
|     this.column = column; | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| export function locationForIndex(pos) { | ||||
|   let line = 1; | ||||
|   let column = 1; | ||||
|   for (let i = 0; i < pos; i++) { | ||||
|     if (input.charCodeAt(i) === charCodes.lineFeed) { | ||||
|       line++; | ||||
|       column = 1; | ||||
|     } else { | ||||
|       column++; | ||||
|     } | ||||
|   } | ||||
|   return new Loc(line, column); | ||||
| } | ||||
| 
 | ||||
| export function initParser( | ||||
|   inputCode, | ||||
|   isJSXEnabledArg, | ||||
|   isTypeScriptEnabledArg, | ||||
|   isFlowEnabledArg, | ||||
| ) { | ||||
|   input = inputCode; | ||||
|   state = new State(); | ||||
|   nextContextId = 1; | ||||
|   isJSXEnabled = isJSXEnabledArg; | ||||
|   isTypeScriptEnabled = isTypeScriptEnabledArg; | ||||
|   isFlowEnabled = isFlowEnabledArg; | ||||
| } | ||||
							
								
								
									
1022  node_modules/sucrase/dist/esm/parser/traverser/expression.js  generated  vendored  Normal file
(File diff suppressed because it is too large — Load diff)

18  node_modules/sucrase/dist/esm/parser/traverser/index.js  generated  vendored  Normal file
|  | @ -0,0 +1,18 @@ | |||
| 
 | ||||
| import {nextToken, skipLineComment} from "../tokenizer/index"; | ||||
| import {charCodes} from "../util/charcodes"; | ||||
| import {input, state} from "./base"; | ||||
| import {parseTopLevel} from "./statement"; | ||||
| 
 | ||||
| export function parseFile() { | ||||
|   // If enabled, skip leading hashbang line.
 | ||||
|   if ( | ||||
|     state.pos === 0 && | ||||
|     input.charCodeAt(0) === charCodes.numberSign && | ||||
|     input.charCodeAt(1) === charCodes.exclamationMark | ||||
|   ) { | ||||
|     skipLineComment(2); | ||||
|   } | ||||
|   nextToken(); | ||||
|   return parseTopLevel(); | ||||
| } | ||||
							
								
								
									
159  node_modules/sucrase/dist/esm/parser/traverser/lval.js  generated  vendored  Normal file
|  | @ -0,0 +1,159 @@ | |||
| import {flowParseAssignableListItemTypes} from "../plugins/flow"; | ||||
| import {tsParseAssignableListItemTypes, tsParseModifiers} from "../plugins/typescript"; | ||||
| import { | ||||
|   eat, | ||||
|   IdentifierRole, | ||||
|   match, | ||||
|   next, | ||||
|   popTypeContext, | ||||
|   pushTypeContext, | ||||
| } from "../tokenizer/index"; | ||||
| import {ContextualKeyword} from "../tokenizer/keywords"; | ||||
| import {TokenType, TokenType as tt} from "../tokenizer/types"; | ||||
| import {isFlowEnabled, isTypeScriptEnabled, state} from "./base"; | ||||
| import {parseIdentifier, parseMaybeAssign, parseObj} from "./expression"; | ||||
| import {expect, unexpected} from "./util"; | ||||
| 
 | ||||
| export function parseSpread() { | ||||
|   next(); | ||||
|   parseMaybeAssign(false); | ||||
| } | ||||
| 
 | ||||
| export function parseRest(isBlockScope) { | ||||
|   next(); | ||||
|   parseBindingAtom(isBlockScope); | ||||
| } | ||||
| 
 | ||||
| export function parseBindingIdentifier(isBlockScope) { | ||||
|   parseIdentifier(); | ||||
|   markPriorBindingIdentifier(isBlockScope); | ||||
| } | ||||
| 
 | ||||
| export function parseImportedIdentifier() { | ||||
|   parseIdentifier(); | ||||
|   state.tokens[state.tokens.length - 1].identifierRole = IdentifierRole.ImportDeclaration; | ||||
| } | ||||
| 
 | ||||
| export function markPriorBindingIdentifier(isBlockScope) { | ||||
|   let identifierRole; | ||||
|   if (state.scopeDepth === 0) { | ||||
|     identifierRole = IdentifierRole.TopLevelDeclaration; | ||||
|   } else if (isBlockScope) { | ||||
|     identifierRole = IdentifierRole.BlockScopedDeclaration; | ||||
|   } else { | ||||
|     identifierRole = IdentifierRole.FunctionScopedDeclaration; | ||||
|   } | ||||
|   state.tokens[state.tokens.length - 1].identifierRole = identifierRole; | ||||
| } | ||||
| 
 | ||||
| // Parses lvalue (assignable) atom.
 | ||||
| export function parseBindingAtom(isBlockScope) { | ||||
|   switch (state.type) { | ||||
|     case tt._this: { | ||||
|       // In TypeScript, "this" may be the name of a parameter, so allow it.
 | ||||
|       const oldIsType = pushTypeContext(0); | ||||
|       next(); | ||||
|       popTypeContext(oldIsType); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case tt._yield: | ||||
|     case tt.name: { | ||||
|       state.type = tt.name; | ||||
|       parseBindingIdentifier(isBlockScope); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case tt.bracketL: { | ||||
|       next(); | ||||
|       parseBindingList(tt.bracketR, isBlockScope, true /* allowEmpty */); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case tt.braceL: | ||||
|       parseObj(true, isBlockScope); | ||||
|       return; | ||||
| 
 | ||||
|     default: | ||||
|       unexpected(); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| export function parseBindingList( | ||||
|   close, | ||||
|   isBlockScope, | ||||
|   allowEmpty = false, | ||||
|   allowModifiers = false, | ||||
|   contextId = 0, | ||||
| ) { | ||||
|   let first = true; | ||||
| 
 | ||||
|   let hasRemovedComma = false; | ||||
|   const firstItemTokenIndex = state.tokens.length; | ||||
| 
 | ||||
|   while (!eat(close) && !state.error) { | ||||
|     if (first) { | ||||
|       first = false; | ||||
|     } else { | ||||
|       expect(tt.comma); | ||||
|       state.tokens[state.tokens.length - 1].contextId = contextId; | ||||
|       // After a "this" type in TypeScript, we need to set the following comma (if any) to also be
 | ||||
|       // a type token so that it will be removed.
 | ||||
|       if (!hasRemovedComma && state.tokens[firstItemTokenIndex].isType) { | ||||
|         state.tokens[state.tokens.length - 1].isType = true; | ||||
|         hasRemovedComma = true; | ||||
|       } | ||||
|     } | ||||
|     if (allowEmpty && match(tt.comma)) { | ||||
|       // Empty item; nothing further to parse for this item.
 | ||||
|     } else if (eat(close)) { | ||||
|       break; | ||||
|     } else if (match(tt.ellipsis)) { | ||||
|       parseRest(isBlockScope); | ||||
|       parseAssignableListItemTypes(); | ||||
|       // Support rest element trailing commas allowed by TypeScript <2.9.
 | ||||
|       eat(tt.comma); | ||||
|       expect(close); | ||||
|       break; | ||||
|     } else { | ||||
|       parseAssignableListItem(allowModifiers, isBlockScope); | ||||
|     } | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function parseAssignableListItem(allowModifiers, isBlockScope) { | ||||
|   if (allowModifiers) { | ||||
|     tsParseModifiers([ | ||||
|       ContextualKeyword._public, | ||||
|       ContextualKeyword._protected, | ||||
|       ContextualKeyword._private, | ||||
|       ContextualKeyword._readonly, | ||||
|       ContextualKeyword._override, | ||||
|     ]); | ||||
|   } | ||||
| 
 | ||||
|   parseMaybeDefault(isBlockScope); | ||||
|   parseAssignableListItemTypes(); | ||||
|   parseMaybeDefault(isBlockScope, true /* leftAlreadyParsed */); | ||||
| } | ||||
| 
 | ||||
| function parseAssignableListItemTypes() { | ||||
|   if (isFlowEnabled) { | ||||
|     flowParseAssignableListItemTypes(); | ||||
|   } else if (isTypeScriptEnabled) { | ||||
|     tsParseAssignableListItemTypes(); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Parses assignment pattern around given atom if possible.
 | ||||
| export function parseMaybeDefault(isBlockScope, leftAlreadyParsed = false) { | ||||
|   if (!leftAlreadyParsed) { | ||||
|     parseBindingAtom(isBlockScope); | ||||
|   } | ||||
|   if (!eat(tt.eq)) { | ||||
|     return; | ||||
|   } | ||||
|   const eqIndex = state.tokens.length - 1; | ||||
|   parseMaybeAssign(); | ||||
|   state.tokens[eqIndex].rhsEndIndex = state.tokens.length; | ||||
| } | ||||
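As a standalone illustration, the role-selection branch in `markPriorBindingIdentifier` above reduces to a pure function. This is a sketch only: the plain string role names stand in for the `IdentifierRole` enum members and the function name is hypothetical.

```javascript
// Sketch of the declaration-role selection used by markPriorBindingIdentifier.
// Plain strings stand in for the IdentifierRole enum members (assumption).
function pickIdentifierRole(scopeDepth, isBlockScope) {
  if (scopeDepth === 0) return "TopLevelDeclaration"; // module scope wins first
  if (isBlockScope) return "BlockScopedDeclaration";  // let/const/class bindings
  return "FunctionScopedDeclaration";                 // var and parameters
}

console.log(pickIdentifierRole(0, true));  // top-level regardless of block scoping
console.log(pickIdentifierRole(2, true));
console.log(pickIdentifierRole(2, false));
```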
							
								
								
									
1332  node_modules/sucrase/dist/esm/parser/traverser/statement.js  generated vendored Normal file
File diff suppressed because it is too large. Load diff

104  node_modules/sucrase/dist/esm/parser/traverser/util.js  generated vendored Normal file
|  | @ -0,0 +1,104 @@ | |||
| import {eat, finishToken, lookaheadTypeAndKeyword, match, nextTokenStart} from "../tokenizer/index"; | ||||
| 
 | ||||
| import {formatTokenType, TokenType as tt} from "../tokenizer/types"; | ||||
| import {charCodes} from "../util/charcodes"; | ||||
| import {input, state} from "./base"; | ||||
| 
 | ||||
| // ## Parser utilities
 | ||||
| 
 | ||||
| // Tests whether parsed token is a contextual keyword.
 | ||||
| export function isContextual(contextualKeyword) { | ||||
|   return state.contextualKeyword === contextualKeyword; | ||||
| } | ||||
| 
 | ||||
| export function isLookaheadContextual(contextualKeyword) { | ||||
|   const l = lookaheadTypeAndKeyword(); | ||||
|   return l.type === tt.name && l.contextualKeyword === contextualKeyword; | ||||
| } | ||||
| 
 | ||||
| // Consumes contextual keyword if possible.
 | ||||
| export function eatContextual(contextualKeyword) { | ||||
|   return state.contextualKeyword === contextualKeyword && eat(tt.name); | ||||
| } | ||||
| 
 | ||||
| // Asserts that following token is given contextual keyword.
 | ||||
| export function expectContextual(contextualKeyword) { | ||||
|   if (!eatContextual(contextualKeyword)) { | ||||
|     unexpected(); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Test whether a semicolon can be inserted at the current position.
 | ||||
| export function canInsertSemicolon() { | ||||
|   return match(tt.eof) || match(tt.braceR) || hasPrecedingLineBreak(); | ||||
| } | ||||
| 
 | ||||
| export function hasPrecedingLineBreak() { | ||||
|   const prevToken = state.tokens[state.tokens.length - 1]; | ||||
|   const lastTokEnd = prevToken ? prevToken.end : 0; | ||||
|   for (let i = lastTokEnd; i < state.start; i++) { | ||||
|     const code = input.charCodeAt(i); | ||||
|     if ( | ||||
|       code === charCodes.lineFeed || | ||||
|       code === charCodes.carriageReturn || | ||||
|       code === 0x2028 || | ||||
|       code === 0x2029 | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } | ||||
| 
 | ||||
| export function hasFollowingLineBreak() { | ||||
|   const nextStart = nextTokenStart(); | ||||
|   for (let i = state.end; i < nextStart; i++) { | ||||
|     const code = input.charCodeAt(i); | ||||
|     if ( | ||||
|       code === charCodes.lineFeed || | ||||
|       code === charCodes.carriageReturn || | ||||
|       code === 0x2028 || | ||||
|       code === 0x2029 | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } | ||||
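Both scans above share the same inner loop. As a self-contained sketch, the check over a half-open range `[from, to)` looks like this (the function name is hypothetical; the four char codes are the ECMAScript line terminators LF, CR, U+2028, and U+2029, matching the constants used above):

```javascript
// Standalone version of the line-terminator scan behind hasPrecedingLineBreak
// and hasFollowingLineBreak: true if any line terminator occurs in [from, to).
function containsLineBreak(input, from, to) {
  for (let i = from; i < to; i++) {
    const code = input.charCodeAt(i);
    if (code === 10 || code === 13 || code === 0x2028 || code === 0x2029) {
      return true;
    }
  }
  return false;
}

console.log(containsLineBreak("a \n b", 0, 5)); // true
console.log(containsLineBreak("a b", 0, 3));    // false
```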
| 
 | ||||
| export function isLineTerminator() { | ||||
|   return eat(tt.semi) || canInsertSemicolon(); | ||||
| } | ||||
| 
 | ||||
| // Consume a semicolon, or, failing that, see if we are allowed to
 | ||||
| // pretend that there is a semicolon at this position.
 | ||||
| export function semicolon() { | ||||
|   if (!isLineTerminator()) { | ||||
|     unexpected('Unexpected token, expected ";"'); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Expect a token of a given type. If found, consume it, otherwise,
 | ||||
| // raise an unexpected token error at given pos.
 | ||||
| export function expect(type) { | ||||
|   const matched = eat(type); | ||||
|   if (!matched) { | ||||
|     unexpected(`Unexpected token, expected "${formatTokenType(type)}"`); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Transition the parser to an error state. All code needs to be written to naturally unwind in this | ||||
|  * state, which allows us to backtrack without exceptions and without error plumbing everywhere. | ||||
|  */ | ||||
| export function unexpected(message = "Unexpected token", pos = state.start) { | ||||
|   if (state.error) { | ||||
|     return; | ||||
|   } | ||||
|   // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|   const err = new SyntaxError(message); | ||||
|   err.pos = pos; | ||||
|   state.error = err; | ||||
|   state.pos = input.length; | ||||
|   finishToken(tt.eof); | ||||
| } | ||||
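The `unexpected()` helper above implements error recovery without exceptions: only the first error is recorded on `state`, and the position jumps to end of input so every parse loop that checks `state.error` unwinds naturally. A minimal standalone sketch of that pattern, with a simplified `state` shape (an assumption of this sketch):

```javascript
// Simplified error-state pattern: record only the first SyntaxError and force
// the position to the end of input so scanning terminates on its own.
const state = {error: null, pos: 0};

function unexpected(message, inputLength) {
  if (state.error) return; // later errors during unwinding are ignored
  const err = new SyntaxError(message);
  state.error = err;
  state.pos = inputLength; // skip to EOF
}

unexpected("Unexpected token", 42);
unexpected("this one is dropped", 99);
console.log(state.error.message); // "Unexpected token"
console.log(state.pos);           // 42
```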
							
								
								
									
115  node_modules/sucrase/dist/esm/parser/util/charcodes.js  generated vendored Normal file
|  | @ -0,0 +1,115 @@ | |||
| export var charCodes; (function (charCodes) { | ||||
|   const backSpace = 8; charCodes[charCodes["backSpace"] = backSpace] = "backSpace"; | ||||
|   const lineFeed = 10; charCodes[charCodes["lineFeed"] = lineFeed] = "lineFeed"; //  '\n'
 | ||||
|   const tab = 9; charCodes[charCodes["tab"] = tab] = "tab"; //  '\t'
 | ||||
|   const carriageReturn = 13; charCodes[charCodes["carriageReturn"] = carriageReturn] = "carriageReturn"; //  '\r'
 | ||||
|   const shiftOut = 14; charCodes[charCodes["shiftOut"] = shiftOut] = "shiftOut"; | ||||
|   const space = 32; charCodes[charCodes["space"] = space] = "space"; | ||||
|   const exclamationMark = 33; charCodes[charCodes["exclamationMark"] = exclamationMark] = "exclamationMark"; //  '!'
 | ||||
|   const quotationMark = 34; charCodes[charCodes["quotationMark"] = quotationMark] = "quotationMark"; //  '"'
 | ||||
|   const numberSign = 35; charCodes[charCodes["numberSign"] = numberSign] = "numberSign"; //  '#'
 | ||||
|   const dollarSign = 36; charCodes[charCodes["dollarSign"] = dollarSign] = "dollarSign"; //  '$'
 | ||||
|   const percentSign = 37; charCodes[charCodes["percentSign"] = percentSign] = "percentSign"; //  '%'
 | ||||
|   const ampersand = 38; charCodes[charCodes["ampersand"] = ampersand] = "ampersand"; //  '&'
 | ||||
|   const apostrophe = 39; charCodes[charCodes["apostrophe"] = apostrophe] = "apostrophe"; //  '''
 | ||||
|   const leftParenthesis = 40; charCodes[charCodes["leftParenthesis"] = leftParenthesis] = "leftParenthesis"; //  '('
 | ||||
|   const rightParenthesis = 41; charCodes[charCodes["rightParenthesis"] = rightParenthesis] = "rightParenthesis"; //  ')'
 | ||||
|   const asterisk = 42; charCodes[charCodes["asterisk"] = asterisk] = "asterisk"; //  '*'
 | ||||
|   const plusSign = 43; charCodes[charCodes["plusSign"] = plusSign] = "plusSign"; //  '+'
 | ||||
|   const comma = 44; charCodes[charCodes["comma"] = comma] = "comma"; //  ','
 | ||||
|   const dash = 45; charCodes[charCodes["dash"] = dash] = "dash"; //  '-'
 | ||||
|   const dot = 46; charCodes[charCodes["dot"] = dot] = "dot"; //  '.'
 | ||||
|   const slash = 47; charCodes[charCodes["slash"] = slash] = "slash"; //  '/'
 | ||||
|   const digit0 = 48; charCodes[charCodes["digit0"] = digit0] = "digit0"; //  '0'
 | ||||
|   const digit1 = 49; charCodes[charCodes["digit1"] = digit1] = "digit1"; //  '1'
 | ||||
|   const digit2 = 50; charCodes[charCodes["digit2"] = digit2] = "digit2"; //  '2'
 | ||||
|   const digit3 = 51; charCodes[charCodes["digit3"] = digit3] = "digit3"; //  '3'
 | ||||
|   const digit4 = 52; charCodes[charCodes["digit4"] = digit4] = "digit4"; //  '4'
 | ||||
|   const digit5 = 53; charCodes[charCodes["digit5"] = digit5] = "digit5"; //  '5'
 | ||||
|   const digit6 = 54; charCodes[charCodes["digit6"] = digit6] = "digit6"; //  '6'
 | ||||
|   const digit7 = 55; charCodes[charCodes["digit7"] = digit7] = "digit7"; //  '7'
 | ||||
|   const digit8 = 56; charCodes[charCodes["digit8"] = digit8] = "digit8"; //  '8'
 | ||||
|   const digit9 = 57; charCodes[charCodes["digit9"] = digit9] = "digit9"; //  '9'
 | ||||
|   const colon = 58; charCodes[charCodes["colon"] = colon] = "colon"; //  ':'
 | ||||
|   const semicolon = 59; charCodes[charCodes["semicolon"] = semicolon] = "semicolon"; //  ';'
 | ||||
|   const lessThan = 60; charCodes[charCodes["lessThan"] = lessThan] = "lessThan"; //  '<'
 | ||||
|   const equalsTo = 61; charCodes[charCodes["equalsTo"] = equalsTo] = "equalsTo"; //  '='
 | ||||
|   const greaterThan = 62; charCodes[charCodes["greaterThan"] = greaterThan] = "greaterThan"; //  '>'
 | ||||
|   const questionMark = 63; charCodes[charCodes["questionMark"] = questionMark] = "questionMark"; //  '?'
 | ||||
|   const atSign = 64; charCodes[charCodes["atSign"] = atSign] = "atSign"; //  '@'
 | ||||
|   const uppercaseA = 65; charCodes[charCodes["uppercaseA"] = uppercaseA] = "uppercaseA"; //  'A'
 | ||||
|   const uppercaseB = 66; charCodes[charCodes["uppercaseB"] = uppercaseB] = "uppercaseB"; //  'B'
 | ||||
|   const uppercaseC = 67; charCodes[charCodes["uppercaseC"] = uppercaseC] = "uppercaseC"; //  'C'
 | ||||
|   const uppercaseD = 68; charCodes[charCodes["uppercaseD"] = uppercaseD] = "uppercaseD"; //  'D'
 | ||||
|   const uppercaseE = 69; charCodes[charCodes["uppercaseE"] = uppercaseE] = "uppercaseE"; //  'E'
 | ||||
|   const uppercaseF = 70; charCodes[charCodes["uppercaseF"] = uppercaseF] = "uppercaseF"; //  'F'
 | ||||
|   const uppercaseG = 71; charCodes[charCodes["uppercaseG"] = uppercaseG] = "uppercaseG"; //  'G'
 | ||||
|   const uppercaseH = 72; charCodes[charCodes["uppercaseH"] = uppercaseH] = "uppercaseH"; //  'H'
 | ||||
|   const uppercaseI = 73; charCodes[charCodes["uppercaseI"] = uppercaseI] = "uppercaseI"; //  'I'
 | ||||
|   const uppercaseJ = 74; charCodes[charCodes["uppercaseJ"] = uppercaseJ] = "uppercaseJ"; //  'J'
 | ||||
|   const uppercaseK = 75; charCodes[charCodes["uppercaseK"] = uppercaseK] = "uppercaseK"; //  'K'
 | ||||
|   const uppercaseL = 76; charCodes[charCodes["uppercaseL"] = uppercaseL] = "uppercaseL"; //  'L'
 | ||||
|   const uppercaseM = 77; charCodes[charCodes["uppercaseM"] = uppercaseM] = "uppercaseM"; //  'M'
 | ||||
|   const uppercaseN = 78; charCodes[charCodes["uppercaseN"] = uppercaseN] = "uppercaseN"; //  'N'
 | ||||
|   const uppercaseO = 79; charCodes[charCodes["uppercaseO"] = uppercaseO] = "uppercaseO"; //  'O'
 | ||||
|   const uppercaseP = 80; charCodes[charCodes["uppercaseP"] = uppercaseP] = "uppercaseP"; //  'P'
 | ||||
|   const uppercaseQ = 81; charCodes[charCodes["uppercaseQ"] = uppercaseQ] = "uppercaseQ"; //  'Q'
 | ||||
|   const uppercaseR = 82; charCodes[charCodes["uppercaseR"] = uppercaseR] = "uppercaseR"; //  'R'
 | ||||
|   const uppercaseS = 83; charCodes[charCodes["uppercaseS"] = uppercaseS] = "uppercaseS"; //  'S'
 | ||||
|   const uppercaseT = 84; charCodes[charCodes["uppercaseT"] = uppercaseT] = "uppercaseT"; //  'T'
 | ||||
|   const uppercaseU = 85; charCodes[charCodes["uppercaseU"] = uppercaseU] = "uppercaseU"; //  'U'
 | ||||
|   const uppercaseV = 86; charCodes[charCodes["uppercaseV"] = uppercaseV] = "uppercaseV"; //  'V'
 | ||||
|   const uppercaseW = 87; charCodes[charCodes["uppercaseW"] = uppercaseW] = "uppercaseW"; //  'W'
 | ||||
|   const uppercaseX = 88; charCodes[charCodes["uppercaseX"] = uppercaseX] = "uppercaseX"; //  'X'
 | ||||
|   const uppercaseY = 89; charCodes[charCodes["uppercaseY"] = uppercaseY] = "uppercaseY"; //  'Y'
 | ||||
|   const uppercaseZ = 90; charCodes[charCodes["uppercaseZ"] = uppercaseZ] = "uppercaseZ"; //  'Z'
 | ||||
|   const leftSquareBracket = 91; charCodes[charCodes["leftSquareBracket"] = leftSquareBracket] = "leftSquareBracket"; //  '['
 | ||||
|   const backslash = 92; charCodes[charCodes["backslash"] = backslash] = "backslash"; //  '\\'
 | ||||
|   const rightSquareBracket = 93; charCodes[charCodes["rightSquareBracket"] = rightSquareBracket] = "rightSquareBracket"; //  ']'
 | ||||
|   const caret = 94; charCodes[charCodes["caret"] = caret] = "caret"; //  '^'
 | ||||
|   const underscore = 95; charCodes[charCodes["underscore"] = underscore] = "underscore"; //  '_'
 | ||||
|   const graveAccent = 96; charCodes[charCodes["graveAccent"] = graveAccent] = "graveAccent"; //  '`'
 | ||||
|   const lowercaseA = 97; charCodes[charCodes["lowercaseA"] = lowercaseA] = "lowercaseA"; //  'a'
 | ||||
|   const lowercaseB = 98; charCodes[charCodes["lowercaseB"] = lowercaseB] = "lowercaseB"; //  'b'
 | ||||
|   const lowercaseC = 99; charCodes[charCodes["lowercaseC"] = lowercaseC] = "lowercaseC"; //  'c'
 | ||||
|   const lowercaseD = 100; charCodes[charCodes["lowercaseD"] = lowercaseD] = "lowercaseD"; //  'd'
 | ||||
|   const lowercaseE = 101; charCodes[charCodes["lowercaseE"] = lowercaseE] = "lowercaseE"; //  'e'
 | ||||
|   const lowercaseF = 102; charCodes[charCodes["lowercaseF"] = lowercaseF] = "lowercaseF"; //  'f'
 | ||||
|   const lowercaseG = 103; charCodes[charCodes["lowercaseG"] = lowercaseG] = "lowercaseG"; //  'g'
 | ||||
|   const lowercaseH = 104; charCodes[charCodes["lowercaseH"] = lowercaseH] = "lowercaseH"; //  'h'
 | ||||
|   const lowercaseI = 105; charCodes[charCodes["lowercaseI"] = lowercaseI] = "lowercaseI"; //  'i'
 | ||||
|   const lowercaseJ = 106; charCodes[charCodes["lowercaseJ"] = lowercaseJ] = "lowercaseJ"; //  'j'
 | ||||
|   const lowercaseK = 107; charCodes[charCodes["lowercaseK"] = lowercaseK] = "lowercaseK"; //  'k'
 | ||||
|   const lowercaseL = 108; charCodes[charCodes["lowercaseL"] = lowercaseL] = "lowercaseL"; //  'l'
 | ||||
|   const lowercaseM = 109; charCodes[charCodes["lowercaseM"] = lowercaseM] = "lowercaseM"; //  'm'
 | ||||
|   const lowercaseN = 110; charCodes[charCodes["lowercaseN"] = lowercaseN] = "lowercaseN"; //  'n'
 | ||||
|   const lowercaseO = 111; charCodes[charCodes["lowercaseO"] = lowercaseO] = "lowercaseO"; //  'o'
 | ||||
|   const lowercaseP = 112; charCodes[charCodes["lowercaseP"] = lowercaseP] = "lowercaseP"; //  'p'
 | ||||
|   const lowercaseQ = 113; charCodes[charCodes["lowercaseQ"] = lowercaseQ] = "lowercaseQ"; //  'q'
 | ||||
|   const lowercaseR = 114; charCodes[charCodes["lowercaseR"] = lowercaseR] = "lowercaseR"; //  'r'
 | ||||
|   const lowercaseS = 115; charCodes[charCodes["lowercaseS"] = lowercaseS] = "lowercaseS"; //  's'
 | ||||
|   const lowercaseT = 116; charCodes[charCodes["lowercaseT"] = lowercaseT] = "lowercaseT"; //  't'
 | ||||
|   const lowercaseU = 117; charCodes[charCodes["lowercaseU"] = lowercaseU] = "lowercaseU"; //  'u'
 | ||||
|   const lowercaseV = 118; charCodes[charCodes["lowercaseV"] = lowercaseV] = "lowercaseV"; //  'v'
 | ||||
|   const lowercaseW = 119; charCodes[charCodes["lowercaseW"] = lowercaseW] = "lowercaseW"; //  'w'
 | ||||
|   const lowercaseX = 120; charCodes[charCodes["lowercaseX"] = lowercaseX] = "lowercaseX"; //  'x'
 | ||||
|   const lowercaseY = 121; charCodes[charCodes["lowercaseY"] = lowercaseY] = "lowercaseY"; //  'y'
 | ||||
|   const lowercaseZ = 122; charCodes[charCodes["lowercaseZ"] = lowercaseZ] = "lowercaseZ"; //  'z'
 | ||||
|   const leftCurlyBrace = 123; charCodes[charCodes["leftCurlyBrace"] = leftCurlyBrace] = "leftCurlyBrace"; //  '{'
 | ||||
|   const verticalBar = 124; charCodes[charCodes["verticalBar"] = verticalBar] = "verticalBar"; //  '|'
 | ||||
|   const rightCurlyBrace = 125; charCodes[charCodes["rightCurlyBrace"] = rightCurlyBrace] = "rightCurlyBrace"; //  '}'
 | ||||
|   const tilde = 126; charCodes[charCodes["tilde"] = tilde] = "tilde"; //  '~'
 | ||||
|   const nonBreakingSpace = 160; charCodes[charCodes["nonBreakingSpace"] = nonBreakingSpace] = "nonBreakingSpace"; | ||||
|   // eslint-disable-next-line no-irregular-whitespace
 | ||||
|   const oghamSpaceMark = 5760; charCodes[charCodes["oghamSpaceMark"] = oghamSpaceMark] = "oghamSpaceMark"; // ' '
 | ||||
|   const lineSeparator = 8232; charCodes[charCodes["lineSeparator"] = lineSeparator] = "lineSeparator"; | ||||
|   const paragraphSeparator = 8233; charCodes[charCodes["paragraphSeparator"] = paragraphSeparator] = "paragraphSeparator"; | ||||
| })(charCodes || (charCodes = {})); | ||||
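This file is sucrase's compiled TypeScript output, and the enum-compilation pattern above writes each member in both directions: the numeric value is stored under the name, and the name is stored back under the numeric value. A reduced sketch of the same idiom (the `demo` object is illustrative):

```javascript
// Reduced version of the compiled-enum pattern: `demo[demo["name"] = value] = "name"`
// assigns value under the name, then assigns the name under the value.
const demo = {};
(function (demo) {
  const lineFeed = 10; demo[demo["lineFeed"] = lineFeed] = "lineFeed";
  const space = 32; demo[demo["space"] = space] = "space";
})(demo);

console.log(demo.lineFeed); // 10
console.log(demo[10]);      // "lineFeed"
```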
| 
 | ||||
| export function isDigit(code) { | ||||
|   return ( | ||||
|     (code >= charCodes.digit0 && code <= charCodes.digit9) || | ||||
|     (code >= charCodes.lowercaseA && code <= charCodes.lowercaseF) || | ||||
|     (code >= charCodes.uppercaseA && code <= charCodes.uppercaseF) | ||||
|   ); | ||||
| } | ||||
							
								
								
									
34  node_modules/sucrase/dist/esm/parser/util/identifier.js  generated vendored Normal file
|  | @ -0,0 +1,34 @@ | |||
| import {charCodes} from "./charcodes"; | ||||
| import {WHITESPACE_CHARS} from "./whitespace"; | ||||
| 
 | ||||
| function computeIsIdentifierChar(code) { | ||||
|   if (code < 48) return code === 36; | ||||
|   if (code < 58) return true; | ||||
|   if (code < 65) return false; | ||||
|   if (code < 91) return true; | ||||
|   if (code < 97) return code === 95; | ||||
|   if (code < 123) return true; | ||||
|   if (code < 128) return false; | ||||
|   throw new Error("Should not be called with non-ASCII char code."); | ||||
| } | ||||
| 
 | ||||
| export const IS_IDENTIFIER_CHAR = new Uint8Array(65536); | ||||
| for (let i = 0; i < 128; i++) { | ||||
|   IS_IDENTIFIER_CHAR[i] = computeIsIdentifierChar(i) ? 1 : 0; | ||||
| } | ||||
| for (let i = 128; i < 65536; i++) { | ||||
|   IS_IDENTIFIER_CHAR[i] = 1; | ||||
| } | ||||
| // Aside from whitespace and newlines, all characters outside the ASCII space are either
 | ||||
| // identifier characters or invalid. Since we're not performing code validation, we can just
 | ||||
| // treat all invalid characters as identifier characters.
 | ||||
| for (const whitespaceChar of WHITESPACE_CHARS) { | ||||
|   IS_IDENTIFIER_CHAR[whitespaceChar] = 0; | ||||
| } | ||||
| IS_IDENTIFIER_CHAR[0x2028] = 0; | ||||
| IS_IDENTIFIER_CHAR[0x2029] = 0; | ||||
| 
 | ||||
| export const IS_IDENTIFIER_START = IS_IDENTIFIER_CHAR.slice(); | ||||
| for (let numChar = charCodes.digit0; numChar <= charCodes.digit9; numChar++) { | ||||
|   IS_IDENTIFIER_START[numChar] = 0; | ||||
| } | ||||
							
								
								
									
33  node_modules/sucrase/dist/esm/parser/util/whitespace.js  generated vendored Normal file
|  | @ -0,0 +1,33 @@ | |||
| import {charCodes} from "./charcodes"; | ||||
| 
 | ||||
| // https://tc39.github.io/ecma262/#sec-white-space
 | ||||
| export const WHITESPACE_CHARS = [ | ||||
|   0x0009, | ||||
|   0x000b, | ||||
|   0x000c, | ||||
|   charCodes.space, | ||||
|   charCodes.nonBreakingSpace, | ||||
|   charCodes.oghamSpaceMark, | ||||
|   0x2000, // EN QUAD
 | ||||
|   0x2001, // EM QUAD
 | ||||
|   0x2002, // EN SPACE
 | ||||
|   0x2003, // EM SPACE
 | ||||
|   0x2004, // THREE-PER-EM SPACE
 | ||||
|   0x2005, // FOUR-PER-EM SPACE
 | ||||
|   0x2006, // SIX-PER-EM SPACE
 | ||||
|   0x2007, // FIGURE SPACE
 | ||||
|   0x2008, // PUNCTUATION SPACE
 | ||||
|   0x2009, // THIN SPACE
 | ||||
|   0x200a, // HAIR SPACE
 | ||||
|   0x202f, // NARROW NO-BREAK SPACE
 | ||||
|   0x205f, // MEDIUM MATHEMATICAL SPACE
 | ||||
|   0x3000, // IDEOGRAPHIC SPACE
 | ||||
|   0xfeff, // ZERO WIDTH NO-BREAK SPACE
 | ||||
| ]; | ||||
| 
 | ||||
| export const skipWhiteSpace = /(?:\s|\/\/.*|\/\*[^]*?\*\/)*/g; | ||||
| 
 | ||||
| export const IS_WHITESPACE = new Uint8Array(65536); | ||||
| for (const char of WHITESPACE_CHARS) { | ||||
|   IS_WHITESPACE[char] = 1; | ||||
| } | ||||
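Note that the `skipWhiteSpace` regex above consumes line and block comments as well as whitespace. Because it has the `g` flag, setting `lastIndex` and calling `exec()` advances past any run of such trivia in one call, as this small usage sketch shows:

```javascript
// Same pattern as skipWhiteSpace above: whitespace, // comments, /* */ comments.
const skipWhiteSpace = /(?:\s|\/\/.*|\/\*[^]*?\*\/)*/g;

const src = "  // line comment\n  /* block */ token";
skipWhiteSpace.lastIndex = 0;
skipWhiteSpace.exec(src); // with the g flag, exec() updates lastIndex
console.log(src.slice(skipWhiteSpace.lastIndex)); // "token"
```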
							
								
								
									
88  node_modules/sucrase/dist/esm/register.js  generated vendored Normal file
|  | @ -0,0 +1,88 @@ | |||
| import * as pirates from "pirates"; | ||||
| 
 | ||||
| import { transform} from "./index"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| export function addHook( | ||||
|   extension, | ||||
|   sucraseOptions, | ||||
|   hookOptions, | ||||
| ) { | ||||
|   let mergedSucraseOptions = sucraseOptions; | ||||
|   const sucraseOptionsEnvJSON = process.env.SUCRASE_OPTIONS; | ||||
|   if (sucraseOptionsEnvJSON) { | ||||
|     mergedSucraseOptions = {...mergedSucraseOptions, ...JSON.parse(sucraseOptionsEnvJSON)}; | ||||
|   } | ||||
|   return pirates.addHook( | ||||
|     (code, filePath) => { | ||||
|       const {code: transformedCode, sourceMap} = transform(code, { | ||||
|         ...mergedSucraseOptions, | ||||
|         sourceMapOptions: {compiledFilename: filePath}, | ||||
|         filePath, | ||||
|       }); | ||||
|       const mapBase64 = Buffer.from(JSON.stringify(sourceMap)).toString("base64"); | ||||
|       const suffix = `//# sourceMappingURL=data:application/json;charset=utf-8;base64,${mapBase64}`; | ||||
|       return `${transformedCode}\n${suffix}`; | ||||
|     }, | ||||
|     {...hookOptions, exts: [extension]}, | ||||
|   ); | ||||
| } | ||||
| 
 | ||||
| export function registerJS(hookOptions) { | ||||
|   return addHook(".js", {transforms: ["imports", "flow", "jsx"]}, hookOptions); | ||||
| } | ||||
| 
 | ||||
| export function registerJSX(hookOptions) { | ||||
|   return addHook(".jsx", {transforms: ["imports", "flow", "jsx"]}, hookOptions); | ||||
| } | ||||
| 
 | ||||
| export function registerTS(hookOptions) { | ||||
|   return addHook(".ts", {transforms: ["imports", "typescript"]}, hookOptions); | ||||
| } | ||||
| 
 | ||||
| export function registerTSX(hookOptions) { | ||||
|   return addHook(".tsx", {transforms: ["imports", "typescript", "jsx"]}, hookOptions); | ||||
| } | ||||
| 
 | ||||
| export function registerTSLegacyModuleInterop(hookOptions) { | ||||
|   return addHook( | ||||
|     ".ts", | ||||
|     { | ||||
|       transforms: ["imports", "typescript"], | ||||
|       enableLegacyTypeScriptModuleInterop: true, | ||||
|     }, | ||||
|     hookOptions, | ||||
|   ); | ||||
| } | ||||
| 
 | ||||
| export function registerTSXLegacyModuleInterop(hookOptions) { | ||||
|   return addHook( | ||||
|     ".tsx", | ||||
|     { | ||||
|       transforms: ["imports", "typescript", "jsx"], | ||||
|       enableLegacyTypeScriptModuleInterop: true, | ||||
|     }, | ||||
|     hookOptions, | ||||
|   ); | ||||
| } | ||||
| 
 | ||||
| export function registerAll(hookOptions) { | ||||
|   const reverts = [ | ||||
|     registerJS(hookOptions), | ||||
|     registerJSX(hookOptions), | ||||
|     registerTS(hookOptions), | ||||
|     registerTSX(hookOptions), | ||||
|   ]; | ||||
| 
 | ||||
|   return () => { | ||||
|     for (const fn of reverts) { | ||||
|       fn(); | ||||
|     } | ||||
|   }; | ||||
| } | ||||
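`registerAll` above aggregates the revert functions returned by `pirates.addHook`, so one call undoes all four registrations. The pattern, sketched standalone without pirates (all names and the `hooks` array here are illustrative):

```javascript
// Revert-aggregation pattern: each registration returns an undo closure, and
// the aggregate revert replays them all.
const hooks = [];
function addHook(ext) {
  hooks.push(ext);
  return () => { hooks.splice(hooks.indexOf(ext), 1); };
}

function registerAll() {
  const reverts = [addHook(".js"), addHook(".jsx"), addHook(".ts"), addHook(".tsx")];
  return () => { for (const fn of reverts) fn(); };
}

const revert = registerAll();
console.log(hooks.length); // 4
revert();
console.log(hooks.length); // 0
```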
							
								
								
									
916  node_modules/sucrase/dist/esm/transformers/CJSImportTransformer.js  generated vendored Normal file
|  | @ -0,0 +1,916 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| import {IdentifierRole, isDeclaration, isObjectShorthandDeclaration} from "../parser/tokenizer"; | ||||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| import elideImportEquals from "../util/elideImportEquals"; | ||||
| import getDeclarationInfo, { | ||||
| 
 | ||||
|   EMPTY_DECLARATION_INFO, | ||||
| } from "../util/getDeclarationInfo"; | ||||
| import getImportExportSpecifierInfo from "../util/getImportExportSpecifierInfo"; | ||||
| import isExportFrom from "../util/isExportFrom"; | ||||
| import {removeMaybeImportAttributes} from "../util/removeMaybeImportAttributes"; | ||||
| import shouldElideDefaultExport from "../util/shouldElideDefaultExport"; | ||||
| 
 | ||||
| 
 | ||||
| import Transformer from "./Transformer"; | ||||
| 
 | ||||
| /** | ||||
|  * Class for editing import statements when we are transforming to commonjs. | ||||
|  */ | ||||
| export default class CJSImportTransformer extends Transformer { | ||||
|    __init() {this.hadExport = false} | ||||
|    __init2() {this.hadNamedExport = false} | ||||
|    __init3() {this.hadDefaultExport = false} | ||||
|    | ||||
| 
 | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      importProcessor, | ||||
|      nameManager, | ||||
|      helperManager, | ||||
|      reactHotLoaderTransformer, | ||||
|      enableLegacyBabel5ModuleInterop, | ||||
|      enableLegacyTypeScriptModuleInterop, | ||||
|      isTypeScriptTransformEnabled, | ||||
|      isFlowTransformEnabled, | ||||
|      preserveDynamicImport, | ||||
|      keepUnusedImports, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.importProcessor = importProcessor;this.nameManager = nameManager;this.helperManager = helperManager;this.reactHotLoaderTransformer = reactHotLoaderTransformer;this.enableLegacyBabel5ModuleInterop = enableLegacyBabel5ModuleInterop;this.enableLegacyTypeScriptModuleInterop = enableLegacyTypeScriptModuleInterop;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.isFlowTransformEnabled = isFlowTransformEnabled;this.preserveDynamicImport = preserveDynamicImport;this.keepUnusedImports = keepUnusedImports;CJSImportTransformer.prototype.__init.call(this);CJSImportTransformer.prototype.__init2.call(this);CJSImportTransformer.prototype.__init3.call(this);; | ||||
|     this.declarationInfo = isTypeScriptTransformEnabled | ||||
|       ? getDeclarationInfo(tokens) | ||||
|       : EMPTY_DECLARATION_INFO; | ||||
|   } | ||||
| 
 | ||||
|   getPrefixCode() { | ||||
|     let prefix = ""; | ||||
|     if (this.hadExport) { | ||||
|       prefix += 'Object.defineProperty(exports, "__esModule", {value: true});'; | ||||
|     } | ||||
|     return prefix; | ||||
|   } | ||||
| 
 | ||||
|   getSuffixCode() { | ||||
|     if (this.enableLegacyBabel5ModuleInterop && this.hadDefaultExport && !this.hadNamedExport) { | ||||
|       return "\nmodule.exports = exports.default;\n"; | ||||
|     } | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     // TypeScript `import foo = require('foo');` should always just be translated to plain require.
 | ||||
|     if (this.tokens.matches3(tt._import, tt.name, tt.eq)) { | ||||
|       return this.processImportEquals(); | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._import)) { | ||||
|       this.processImport(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt.eq)) { | ||||
|       this.tokens.replaceToken("module.exports"); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._export) && !this.tokens.currentToken().isType) { | ||||
|       this.hadExport = true; | ||||
|       return this.processExport(); | ||||
|     } | ||||
|     if (this.tokens.matches2(tt.name, tt.postIncDec)) { | ||||
|       // Fall through to normal identifier matching if this doesn't apply.
 | ||||
|       if (this.processPostIncDec()) { | ||||
|         return true; | ||||
|       } | ||||
|     } | ||||
|     if (this.tokens.matches1(tt.name) || this.tokens.matches1(tt.jsxName)) { | ||||
|       return this.processIdentifier(); | ||||
|     } | ||||
|     if (this.tokens.matches1(tt.eq)) { | ||||
|       return this.processAssignment(); | ||||
|     } | ||||
|     if (this.tokens.matches1(tt.assign)) { | ||||
|       return this.processComplexAssignment(); | ||||
|     } | ||||
|     if (this.tokens.matches1(tt.preIncDec)) { | ||||
|       return this.processPreIncDec(); | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
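|   /** | ||||
|    * Transform TypeScript `import foo = require('foo');` into | ||||
|    * `const foo = require('foo');`, or elide the statement entirely when the | ||||
|    * imported name is only used as a type. | ||||
|    */ | ||||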
|    processImportEquals() { | ||||
|     const importName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|     if (this.importProcessor.shouldAutomaticallyElideImportedName(importName)) { | ||||
|       // If this name is only used as a type, elide the whole import.
 | ||||
|       elideImportEquals(this.tokens); | ||||
|     } else { | ||||
|       // Otherwise, switch `import` to `const`.
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * import foo, {bar} from 'baz'; | ||||
|    * into | ||||
|    * var _baz = require('baz'); var _baz2 = _interopRequireDefault(_baz); | ||||
|    * | ||||
|    * The import code was already generated in the import preprocessing step, so | ||||
|    * we just need to look it up. | ||||
|    */ | ||||
|    processImport() { | ||||
|     if (this.tokens.matches2(tt._import, tt.parenL)) { | ||||
|       if (this.preserveDynamicImport) { | ||||
|         // Bail out, only making progress for this one token.
 | ||||
|         this.tokens.copyToken(); | ||||
|         return; | ||||
|       } | ||||
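|       // For example, `import('foo')` becomes roughly | ||||
|       // `Promise.resolve().then(() => _interopRequireWildcard(require('foo')))`, | ||||
|       // or `Promise.resolve().then(() => require('foo'))` with legacy | ||||
|       // TypeScript interop. (Illustrative shape; the actual helper name is | ||||
|       // obtained from helperManager below.) | ||||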
|       const requireWrapper = this.enableLegacyTypeScriptModuleInterop | ||||
|         ? "" | ||||
|         : `${this.helperManager.getHelperName("interopRequireWildcard")}(`; | ||||
|       this.tokens.replaceToken(`Promise.resolve().then(() => ${requireWrapper}require`); | ||||
|       const contextId = this.tokens.currentToken().contextId; | ||||
|       if (contextId == null) { | ||||
|         throw new Error("Expected context ID on dynamic import invocation."); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       while (!this.tokens.matchesContextIdAndLabel(tt.parenR, contextId)) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.replaceToken(requireWrapper ? ")))" : "))"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const shouldElideImport = this.removeImportAndDetectIfShouldElide(); | ||||
|     if (shouldElideImport) { | ||||
|       this.tokens.removeToken(); | ||||
|     } else { | ||||
|       const path = this.tokens.stringValue(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|       this.tokens.appendCode(this.importProcessor.claimImportCode(path)); | ||||
|     } | ||||
|     removeMaybeImportAttributes(this.tokens); | ||||
|     if (this.tokens.matches1(tt.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Erase this import (since any CJS output would be completely different), and | ||||
|    * return true if this import should be elided due to being a type-only | ||||
|    * import. Such imports will not be emitted at all to avoid side effects. | ||||
|    * | ||||
|    * Import elision only happens with the TypeScript or Flow transforms enabled. | ||||
|    * | ||||
|    * TODO: This function has some awkward overlap with | ||||
|    *  CJSImportProcessor.pruneTypeOnlyImports, and the two should be unified. | ||||
|    *  That function handles TypeScript implicit import name elision, and removes | ||||
|    *  an import if all typical imported names (without `type`) are removed due | ||||
|    *  to being type-only imports. This function handles Flow import removal and | ||||
|    *  properly distinguishes `import 'foo'` from `import {} from 'foo'` for TS | ||||
|    *  purposes. | ||||
|    * | ||||
|    * The position should end at the import string. | ||||
|    */ | ||||
|    removeImportAndDetectIfShouldElide() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(ContextualKeyword._type) && | ||||
|       !this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, tt.comma) && | ||||
|       !this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, ContextualKeyword._from) | ||||
|     ) { | ||||
|       // This is an "import type" statement, so exit early.
 | ||||
|       this.removeRemainingImport(); | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.name) || this.tokens.matches1(tt.star)) { | ||||
|       // We have a default import or namespace import, so there must be some
 | ||||
|       // non-type import.
 | ||||
|       this.removeRemainingImport(); | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.string)) { | ||||
|       // This is a bare import, so we should proceed with the import.
 | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     let foundNonTypeImport = false; | ||||
|     let foundAnyNamedImport = false; | ||||
|     while (!this.tokens.matches1(tt.string)) { | ||||
|       // Check if any named imports are of the form "foo" or "foo as bar", with
 | ||||
|       // no leading "type".
 | ||||
|       if ( | ||||
|         (!foundNonTypeImport && this.tokens.matches1(tt.braceL)) || | ||||
|         this.tokens.matches1(tt.comma) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         if (!this.tokens.matches1(tt.braceR)) { | ||||
|           foundAnyNamedImport = true; | ||||
|         } | ||||
|         if ( | ||||
|           this.tokens.matches2(tt.name, tt.comma) || | ||||
|           this.tokens.matches2(tt.name, tt.braceR) || | ||||
|           this.tokens.matches4(tt.name, tt.name, tt.name, tt.comma) || | ||||
|           this.tokens.matches4(tt.name, tt.name, tt.name, tt.braceR) | ||||
|         ) { | ||||
|           foundNonTypeImport = true; | ||||
|         } | ||||
|       } | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     if (this.keepUnusedImports) { | ||||
|       return false; | ||||
|     } | ||||
|     if (this.isTypeScriptTransformEnabled) { | ||||
|       return !foundNonTypeImport; | ||||
|     } else if (this.isFlowTransformEnabled) { | ||||
|       // In Flow, unlike TS, `import {} from 'foo';` preserves the import.
 | ||||
|       return foundAnyNamedImport && !foundNonTypeImport; | ||||
|     } else { | ||||
|       return false; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    removeRemainingImport() { | ||||
|     while (!this.tokens.matches1(tt.string)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    processIdentifier() { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     if (token.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     if (token.identifierRole === IdentifierRole.ObjectShorthand) { | ||||
|       return this.processObjectShorthand(); | ||||
|     } | ||||
| 
 | ||||
|     if (token.identifierRole !== IdentifierRole.Access) { | ||||
|       return false; | ||||
|     } | ||||
|     const replacement = this.importProcessor.getIdentifierReplacement( | ||||
|       this.tokens.identifierNameForToken(token), | ||||
|     ); | ||||
|     if (!replacement) { | ||||
|       return false; | ||||
|     } | ||||
|     // Tolerate any number of closing parens while looking for an opening paren
 | ||||
|     // that indicates a function call.
 | ||||
|     let possibleOpenParenIndex = this.tokens.currentIndex() + 1; | ||||
|     while ( | ||||
|       possibleOpenParenIndex < this.tokens.tokens.length && | ||||
|       this.tokens.tokens[possibleOpenParenIndex].type === tt.parenR | ||||
|     ) { | ||||
|       possibleOpenParenIndex++; | ||||
|     } | ||||
|     // Avoid treating imported functions as methods of their `exports` object
 | ||||
|     // by using `(0, f)` when the identifier is in a paren expression. Else
 | ||||
|     // use `Function.prototype.call` when the identifier is a guaranteed
 | ||||
|     // function call. When using `call`, pass undefined as the context.
 | ||||
|     if (this.tokens.tokens[possibleOpenParenIndex].type === tt.parenL) { | ||||
|       if ( | ||||
|         this.tokens.tokenAtRelativeIndex(1).type === tt.parenL && | ||||
|         this.tokens.tokenAtRelativeIndex(-1).type !== tt._new | ||||
|       ) { | ||||
|         this.tokens.replaceToken(`${replacement}.call(void 0, `); | ||||
|         // Remove the old paren.
 | ||||
|         this.tokens.removeToken(); | ||||
|         // Balance out the new paren.
 | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(tt.parenR); | ||||
|       } else { | ||||
|         // See here: http://2ality.com/2015/12/references.html
 | ||||
|         this.tokens.replaceToken(`(0, ${replacement})`); | ||||
|       } | ||||
|     } else { | ||||
|       this.tokens.replaceToken(replacement); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
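|   /** | ||||
|    * Rewrite an object shorthand property for an imported name, e.g. `{foo}` | ||||
|    * to `{foo: _foo.foo}` (illustrative replacement; the actual text comes | ||||
|    * from the import processor). | ||||
|    */ | ||||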
|   processObjectShorthand() { | ||||
|     const identifier = this.tokens.identifierName(); | ||||
|     const replacement = this.importProcessor.getIdentifierReplacement(identifier); | ||||
|     if (!replacement) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.replaceToken(`${identifier}: ${replacement}`); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   processExport() { | ||||
|     if ( | ||||
|       this.tokens.matches2(tt._export, tt._enum) || | ||||
|       this.tokens.matches3(tt._export, tt._const, tt._enum) | ||||
|     ) { | ||||
|       this.hadNamedExport = true; | ||||
|       // Let the TypeScript transform handle it.
 | ||||
|       return false; | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt._default)) { | ||||
|       if (this.tokens.matches3(tt._export, tt._default, tt._enum)) { | ||||
|         this.hadDefaultExport = true; | ||||
|         // Flow export default enums need some special handling, so handle them
 | ||||
|       // in that transform rather than this one.
 | ||||
|         return false; | ||||
|       } | ||||
|       this.processExportDefault(); | ||||
|       return true; | ||||
|     } else if (this.tokens.matches2(tt._export, tt.braceL)) { | ||||
|       this.processExportBindings(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(tt._export, tt.name) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export type {a};
 | ||||
|       // export type {a as b};
 | ||||
|       // export type {a} from './b';
 | ||||
|       // export type * from './b';
 | ||||
|       // export type * as ns from './b';
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       if (this.tokens.matches1(tt.braceL)) { | ||||
|         while (!this.tokens.matches1(tt.braceR)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         // *
 | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(tt._as)) { | ||||
|           // as
 | ||||
|           this.tokens.removeToken(); | ||||
|           // ns
 | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|       // Remove type re-export `... } from './T'`
 | ||||
|       if ( | ||||
|         this.tokens.matchesContextual(ContextualKeyword._from) && | ||||
|         this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, tt.string) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         removeMaybeImportAttributes(this.tokens); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     this.hadNamedExport = true; | ||||
|     if ( | ||||
|       this.tokens.matches2(tt._export, tt._var) || | ||||
|       this.tokens.matches2(tt._export, tt._let) || | ||||
|       this.tokens.matches2(tt._export, tt._const) | ||||
|     ) { | ||||
|       this.processExportVar(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(tt._export, tt._function) || | ||||
|       // export async function
 | ||||
|       this.tokens.matches3(tt._export, tt.name, tt._function) | ||||
|     ) { | ||||
|       this.processExportFunction(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(tt._export, tt._class) || | ||||
|       this.tokens.matches3(tt._export, tt._abstract, tt._class) || | ||||
|       this.tokens.matches2(tt._export, tt.at) | ||||
|     ) { | ||||
|       this.processExportClass(); | ||||
|       return true; | ||||
|     } else if (this.tokens.matches2(tt._export, tt.star)) { | ||||
|       this.processExportStar(); | ||||
|       return true; | ||||
|     } else { | ||||
|       throw new Error("Unrecognized export syntax."); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
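|   /** | ||||
|    * Process a plain assignment like `a = 3;` where `a` might be an exported | ||||
|    * value, e.g. rewriting it to `a = exports.a = 3;` (illustrative; the | ||||
|    * actual snippet comes from resolveExportBinding). | ||||
|    */ | ||||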
|    processAssignment() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index - 1]; | ||||
|     // If the LHS is a type identifier, this must be a declaration like `let a: b = c;`,
 | ||||
|     // with `b` as the identifier, so nothing needs to be done in that case.
 | ||||
|     if (identifierToken.isType || identifierToken.type !== tt.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && this.tokens.matches1AtIndex(index - 2, tt.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && [tt._var, tt._let, tt._const].includes(this.tokens.tokens[index - 2].type)) { | ||||
|       // Declarations don't need an extra assignment. This doesn't avoid the
 | ||||
|       // assignment for comma-separated declarations, but it's still correct
 | ||||
|       // since the assignment is just redundant.
 | ||||
|       return false; | ||||
|     } | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding( | ||||
|       this.tokens.identifierNameForToken(identifierToken), | ||||
|     ); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.copyToken(); | ||||
|     this.tokens.appendCode(` ${assignmentSnippet} =`); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process something like `a += 3`, where `a` might be an exported value. | ||||
|    */ | ||||
|    processComplexAssignment() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index - 1]; | ||||
|     if (identifierToken.type !== tt.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && this.tokens.matches1AtIndex(index - 2, tt.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding( | ||||
|       this.tokens.identifierNameForToken(identifierToken), | ||||
|     ); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.appendCode(` = ${assignmentSnippet}`); | ||||
|     this.tokens.copyToken(); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process something like `++a`, where `a` might be an exported value. | ||||
|    */ | ||||
|    processPreIncDec() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index + 1]; | ||||
|     if (identifierToken.type !== tt.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     // Ignore things like ++a.b and ++a[b] and ++a().b.
 | ||||
|     if ( | ||||
|       index + 2 < this.tokens.tokens.length && | ||||
|       (this.tokens.matches1AtIndex(index + 2, tt.dot) || | ||||
|         this.tokens.matches1AtIndex(index + 2, tt.bracketL) || | ||||
|         this.tokens.matches1AtIndex(index + 2, tt.parenL)) | ||||
|     ) { | ||||
|       return false; | ||||
|     } | ||||
|     const identifierName = this.tokens.identifierNameForToken(identifierToken); | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding(identifierName); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.appendCode(`${assignmentSnippet} = `); | ||||
|     this.tokens.copyToken(); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process something like `a++`, where `a` might be an exported value. | ||||
|    * This starts at the `a`, not at the `++`. | ||||
|    */ | ||||
|    processPostIncDec() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index]; | ||||
|     const operatorToken = this.tokens.tokens[index + 1]; | ||||
|     if (identifierToken.type !== tt.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 1 && this.tokens.matches1AtIndex(index - 1, tt.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     const identifierName = this.tokens.identifierNameForToken(identifierToken); | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding(identifierName); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     const operatorCode = this.tokens.rawCodeForToken(operatorToken); | ||||
|     // We might also replace the identifier with something like exports.x, so
 | ||||
|     // do that replacement here as well.
 | ||||
|     const base = this.importProcessor.getIdentifierReplacement(identifierName) || identifierName; | ||||
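|     // e.g. exported `counter++` becomes | ||||
|     // `(counter = exports.counter = counter + 1, counter - 1)`, updating both | ||||
|     // the local and the export while still evaluating to the old value. | ||||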
|     if (operatorCode === "++") { | ||||
|       this.tokens.replaceToken(`(${base} = ${assignmentSnippet} = ${base} + 1, ${base} - 1)`); | ||||
|     } else if (operatorCode === "--") { | ||||
|       this.tokens.replaceToken(`(${base} = ${assignmentSnippet} = ${base} - 1, ${base} + 1)`); | ||||
|     } else { | ||||
|       throw new Error(`Unexpected operator: ${operatorCode}`); | ||||
|     } | ||||
|     this.tokens.removeToken(); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
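|   /** | ||||
|    * Transform `export default E` into `exports.default = E`, handling named | ||||
|    * function and class declarations by emitting the declaration followed by | ||||
|    * an `exports.default = <name>;` assignment. | ||||
|    */ | ||||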
|    processExportDefault() { | ||||
|     let exportedRuntimeValue = true; | ||||
|     if ( | ||||
|       this.tokens.matches4(tt._export, tt._default, tt._function, tt.name) || | ||||
|       // export default async function
 | ||||
|       (this.tokens.matches5(tt._export, tt._default, tt.name, tt._function, tt.name) && | ||||
|         this.tokens.matchesContextualAtIndex( | ||||
|           this.tokens.currentIndex() + 2, | ||||
|           ContextualKeyword._async, | ||||
|         )) | ||||
|     ) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       // Named function export case: change it to a top-level function
 | ||||
|       // declaration followed by exports statement.
 | ||||
|       const name = this.processNamedFunction(); | ||||
|       this.tokens.appendCode(` exports.default = ${name};`); | ||||
|     } else if ( | ||||
|       this.tokens.matches4(tt._export, tt._default, tt._class, tt.name) || | ||||
|       this.tokens.matches5(tt._export, tt._default, tt._abstract, tt._class, tt.name) || | ||||
|       this.tokens.matches3(tt._export, tt._default, tt.at) | ||||
|     ) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.copyDecorators(); | ||||
|       if (this.tokens.matches1(tt._abstract)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       const name = this.rootTransformer.processNamedClass(); | ||||
|       this.tokens.appendCode(` exports.default = ${name};`); | ||||
|       // After this point, this is a plain "export default E" statement.
 | ||||
|     } else if ( | ||||
|       shouldElideDefaultExport( | ||||
|         this.isTypeScriptTransformEnabled, | ||||
|         this.keepUnusedImports, | ||||
|         this.tokens, | ||||
|         this.declarationInfo, | ||||
|       ) | ||||
|     ) { | ||||
|       // If the exported value is just an identifier and should be elided by TypeScript
 | ||||
|       // rules, then remove it entirely. It will always have the form `export default e`,
 | ||||
|       // where `e` is an identifier.
 | ||||
|       exportedRuntimeValue = false; | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|     } else if (this.reactHotLoaderTransformer) { | ||||
|       // We need to assign E to a variable. Change "export default E" to
 | ||||
|       // "let _default; exports.default = _default = E"
 | ||||
|       const defaultVarName = this.nameManager.claimFreeName("_default"); | ||||
|       this.tokens.replaceToken(`let ${defaultVarName}; exports.`); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(` = ${defaultVarName} =`); | ||||
|       this.reactHotLoaderTransformer.setExtractedDefaultExportName(defaultVarName); | ||||
|     } else { | ||||
|       // Change "export default E" to "exports.default = E"
 | ||||
|       this.tokens.replaceToken("exports."); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(" ="); | ||||
|     } | ||||
|     if (exportedRuntimeValue) { | ||||
|       this.hadDefaultExport = true; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
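|   /** | ||||
|    * Copy decorators through to the output unchanged, e.g. `@dec`, | ||||
|    * `@dec(...)`, `@ns.inner.dec(...)`, or `@(expression)`. | ||||
|    */ | ||||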
|    copyDecorators() { | ||||
|     while (this.tokens.matches1(tt.at)) { | ||||
|       this.tokens.copyToken(); | ||||
|       if (this.tokens.matches1(tt.parenL)) { | ||||
|         this.tokens.copyExpectedToken(tt.parenL); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(tt.parenR); | ||||
|       } else { | ||||
|         this.tokens.copyExpectedToken(tt.name); | ||||
|         while (this.tokens.matches1(tt.dot)) { | ||||
|           this.tokens.copyExpectedToken(tt.dot); | ||||
|           this.tokens.copyExpectedToken(tt.name); | ||||
|         } | ||||
|         if (this.tokens.matches1(tt.parenL)) { | ||||
|           this.tokens.copyExpectedToken(tt.parenL); | ||||
|           this.rootTransformer.processBalancedCode(); | ||||
|           this.tokens.copyExpectedToken(tt.parenR); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform a declaration like `export var`, `export let`, or `export const`. | ||||
|    */ | ||||
|    processExportVar() { | ||||
|     if (this.isSimpleExportVar()) { | ||||
|       this.processSimpleExportVar(); | ||||
|     } else { | ||||
|       this.processComplexExportVar(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Determine if the export is of the form: | ||||
|    * export var/let/const [varName] = [expr]; | ||||
|    * In other words, determine if function name inference might apply. | ||||
|    */ | ||||
|    isSimpleExportVar() { | ||||
|     let tokenIndex = this.tokens.currentIndex(); | ||||
|     // export
 | ||||
|     tokenIndex++; | ||||
|     // var/let/const
 | ||||
|     tokenIndex++; | ||||
|     if (!this.tokens.matches1AtIndex(tokenIndex, tt.name)) { | ||||
|       return false; | ||||
|     } | ||||
|     tokenIndex++; | ||||
|     while (tokenIndex < this.tokens.tokens.length && this.tokens.tokens[tokenIndex].isType) { | ||||
|       tokenIndex++; | ||||
|     } | ||||
|     if (!this.tokens.matches1AtIndex(tokenIndex, tt.eq)) { | ||||
|       return false; | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform an `export var` declaration initializing a single variable. | ||||
|    * | ||||
|    * For example, this: | ||||
|    * export const f = () => {}; | ||||
|    * becomes this: | ||||
|    * const f = () => {}; exports.f = f; | ||||
|    * | ||||
|    * The variable is unused (e.g. exports.f has the true value of the export). | ||||
|    * We need to produce an assignment of this form so that the function will | ||||
|    * have an inferred name of "f", which wouldn't happen in the more general | ||||
|    * case below. | ||||
|    */ | ||||
|    processSimpleExportVar() { | ||||
|     // export
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     // var/let/const
 | ||||
|     this.tokens.copyToken(); | ||||
|     const varName = this.tokens.identifierName(); | ||||
|     // x: number  ->  x
 | ||||
|     while (!this.tokens.matches1(tt.eq)) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|     const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|     if (endIndex == null) { | ||||
|       throw new Error("Expected = token with an end index."); | ||||
|     } | ||||
|     while (this.tokens.currentIndex() < endIndex) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|     this.tokens.appendCode(`; exports.${varName} = ${varName}`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform normal declaration exports, including handling destructuring. | ||||
|    * For example, this: | ||||
|    * export const {x: [a = 2, b], c} = d; | ||||
|    * becomes this: | ||||
|    * ({x: [exports.a = 2, exports.b], c: exports.c} = d); | ||||
|    */ | ||||
|    processComplexExportVar() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.tokens.removeToken(); | ||||
|     const needsParens = this.tokens.matches1(tt.braceL); | ||||
|     if (needsParens) { | ||||
|       this.tokens.appendCode("("); | ||||
|     } | ||||
| 
 | ||||
|     let depth = 0; | ||||
|     while (true) { | ||||
|       if ( | ||||
|         this.tokens.matches1(tt.braceL) || | ||||
|         this.tokens.matches1(tt.dollarBraceL) || | ||||
|         this.tokens.matches1(tt.bracketL) | ||||
|       ) { | ||||
|         depth++; | ||||
|         this.tokens.copyToken(); | ||||
|       } else if (this.tokens.matches1(tt.braceR) || this.tokens.matches1(tt.bracketR)) { | ||||
|         depth--; | ||||
|         this.tokens.copyToken(); | ||||
|       } else if ( | ||||
|         depth === 0 && | ||||
|         !this.tokens.matches1(tt.name) && | ||||
|         !this.tokens.currentToken().isType | ||||
|       ) { | ||||
|         break; | ||||
|       } else if (this.tokens.matches1(tt.eq)) { | ||||
|         // Default values might have assignments in the RHS that we want to ignore, so skip past
 | ||||
|         // them.
 | ||||
|         const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|         if (endIndex == null) { | ||||
|           throw new Error("Expected = token with an end index."); | ||||
|         } | ||||
|         while (this.tokens.currentIndex() < endIndex) { | ||||
|           this.rootTransformer.processToken(); | ||||
|         } | ||||
|       } else { | ||||
|         const token = this.tokens.currentToken(); | ||||
|         if (isDeclaration(token)) { | ||||
|           const name = this.tokens.identifierName(); | ||||
|           let replacement = this.importProcessor.getIdentifierReplacement(name); | ||||
|           if (replacement === null) { | ||||
|             throw new Error(`Expected a replacement for ${name} in \`export var\` syntax.`); | ||||
|           } | ||||
|           if (isObjectShorthandDeclaration(token)) { | ||||
|             replacement = `${name}: ${replacement}`; | ||||
|           } | ||||
|           this.tokens.replaceToken(replacement); | ||||
|         } else { | ||||
|           this.rootTransformer.processToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (needsParens) { | ||||
|       // Seek to the end of the RHS.
 | ||||
|       const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|       if (endIndex == null) { | ||||
|         throw new Error("Expected = token with an end index."); | ||||
|       } | ||||
|       while (this.tokens.currentIndex() < endIndex) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.appendCode(")"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export function foo() {} | ||||
|    * into this: | ||||
|    * function foo() {} exports.foo = foo; | ||||
|    */ | ||||
|    processExportFunction() { | ||||
|     this.tokens.replaceToken(""); | ||||
|     const name = this.processNamedFunction(); | ||||
|     this.tokens.appendCode(` exports.${name} = ${name};`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Skip past a function with a name and return that name. | ||||
|    */ | ||||
|    processNamedFunction() { | ||||
|     if (this.tokens.matches1(tt._function)) { | ||||
|       this.tokens.copyToken(); | ||||
|     } else if (this.tokens.matches2(tt.name, tt._function)) { | ||||
|       if (!this.tokens.matchesContextual(ContextualKeyword._async)) { | ||||
|         throw new Error("Expected async keyword in function export."); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|     if (this.tokens.matches1(tt.star)) { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|     if (!this.tokens.matches1(tt.name)) { | ||||
|       throw new Error("Expected identifier for exported function name."); | ||||
|     } | ||||
|     const name = this.tokens.identifierName(); | ||||
|     this.tokens.copyToken(); | ||||
|     if (this.tokens.currentToken().isType) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       while (this.tokens.currentToken().isType) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt.parenL); | ||||
|     this.rootTransformer.processBalancedCode(); | ||||
|     this.tokens.copyExpectedToken(tt.parenR); | ||||
|     this.rootTransformer.processPossibleTypeRange(); | ||||
|     this.tokens.copyExpectedToken(tt.braceL); | ||||
|     this.rootTransformer.processBalancedCode(); | ||||
|     this.tokens.copyExpectedToken(tt.braceR); | ||||
|     return name; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export class A {} | ||||
|    * into this: | ||||
|    * class A {} exports.A = A; | ||||
|    */ | ||||
|    processExportClass() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.copyDecorators(); | ||||
|     if (this.tokens.matches1(tt._abstract)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const name = this.rootTransformer.processNamedClass(); | ||||
|     this.tokens.appendCode(` exports.${name} = ${name};`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export {a, b as c}; | ||||
|    * into this: | ||||
|    * exports.a = a; exports.c = b; | ||||
|    * | ||||
|    * OR | ||||
|    * | ||||
|    * Transform this: | ||||
|    * export {a, b as c} from './foo'; | ||||
|    * into the pre-generated Object.defineProperty code from the ImportProcessor. | ||||
|    * | ||||
|    * For the first case, if the TypeScript transform is enabled, we need to skip | ||||
|    * exports that are only defined as types. | ||||
|    */ | ||||
|    processExportBindings() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.tokens.removeToken(); | ||||
| 
 | ||||
|     const isReExport = isExportFrom(this.tokens); | ||||
| 
 | ||||
|     const exportStatements = []; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches1(tt.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
| 
 | ||||
|       const specifierInfo = getImportExportSpecifierInfo(this.tokens); | ||||
| 
 | ||||
|       while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
| 
 | ||||
|       const shouldRemoveExport = | ||||
|         specifierInfo.isType || | ||||
|         (!isReExport && this.shouldElideExportedIdentifier(specifierInfo.leftName)); | ||||
|       if (!shouldRemoveExport) { | ||||
|         const exportedName = specifierInfo.rightName; | ||||
|         if (exportedName === "default") { | ||||
|           this.hadDefaultExport = true; | ||||
|         } else { | ||||
|           this.hadNamedExport = true; | ||||
|         } | ||||
|         const localName = specifierInfo.leftName; | ||||
|         const newLocalName = this.importProcessor.getIdentifierReplacement(localName); | ||||
|         exportStatements.push(`exports.${exportedName} = ${newLocalName || localName};`); | ||||
|       } | ||||
| 
 | ||||
|       if (this.tokens.matches1(tt.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
|       if (this.tokens.matches2(tt.comma, tt.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } else if (this.tokens.matches1(tt.comma)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         throw new Error(`Unexpected token: ${JSON.stringify(this.tokens.currentToken())}`); | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matchesContextual(ContextualKeyword._from)) { | ||||
|       // This is an export...from, so throw away the normal named export code
 | ||||
|       // and use the Object.defineProperty code from ImportProcessor.
 | ||||
|       this.tokens.removeToken(); | ||||
|       const path = this.tokens.stringValue(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|       removeMaybeImportAttributes(this.tokens); | ||||
|     } else { | ||||
|       // This is a normal named export, so use that.
 | ||||
|       this.tokens.appendCode(exportStatements.join(" ")); | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    processExportStar() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(tt.string)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const path = this.tokens.stringValue(); | ||||
|     this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|     removeMaybeImportAttributes(this.tokens); | ||||
|     if (this.tokens.matches1(tt.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    shouldElideExportedIdentifier(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       !this.declarationInfo.valueDeclarations.has(name) | ||||
|     ); | ||||
|   } | ||||
| } | ||||
							
								
								
									
										415
									
								
								node_modules/sucrase/dist/esm/transformers/ESMImportTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,415 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| import elideImportEquals from "../util/elideImportEquals"; | ||||
| import getDeclarationInfo, { | ||||
| 
 | ||||
|   EMPTY_DECLARATION_INFO, | ||||
| } from "../util/getDeclarationInfo"; | ||||
| import getImportExportSpecifierInfo from "../util/getImportExportSpecifierInfo"; | ||||
| import {getNonTypeIdentifiers} from "../util/getNonTypeIdentifiers"; | ||||
| import isExportFrom from "../util/isExportFrom"; | ||||
| import {removeMaybeImportAttributes} from "../util/removeMaybeImportAttributes"; | ||||
| import shouldElideDefaultExport from "../util/shouldElideDefaultExport"; | ||||
| 
 | ||||
| import Transformer from "./Transformer"; | ||||
| 
 | ||||
| /** | ||||
|  * Class for editing import statements when we are keeping the code as ESM. We still need to remove | ||||
|  * type-only imports in TypeScript and Flow. | ||||
|  */ | ||||
| export default class ESMImportTransformer extends Transformer { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor( | ||||
|      tokens, | ||||
|      nameManager, | ||||
|      helperManager, | ||||
|      reactHotLoaderTransformer, | ||||
|      isTypeScriptTransformEnabled, | ||||
|      isFlowTransformEnabled, | ||||
|      keepUnusedImports, | ||||
|     options, | ||||
|   ) { | ||||
|     super();this.tokens = tokens;this.nameManager = nameManager;this.helperManager = helperManager;this.reactHotLoaderTransformer = reactHotLoaderTransformer;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.isFlowTransformEnabled = isFlowTransformEnabled;this.keepUnusedImports = keepUnusedImports; | ||||
|     this.nonTypeIdentifiers = | ||||
|       isTypeScriptTransformEnabled && !keepUnusedImports | ||||
|         ? getNonTypeIdentifiers(tokens, options) | ||||
|         : new Set(); | ||||
|     this.declarationInfo = | ||||
|       isTypeScriptTransformEnabled && !keepUnusedImports | ||||
|         ? getDeclarationInfo(tokens) | ||||
|         : EMPTY_DECLARATION_INFO; | ||||
|     this.injectCreateRequireForImportRequire = Boolean(options.injectCreateRequireForImportRequire); | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     // TypeScript `import foo = require('foo');` should always just be translated to plain require.
 | ||||
|     if (this.tokens.matches3(tt._import, tt.name, tt.eq)) { | ||||
|       return this.processImportEquals(); | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches4(tt._import, tt.name, tt.name, tt.eq) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, ContextualKeyword._type) | ||||
|     ) { | ||||
|       // import type T = require('T')
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       // This construct is always exactly 8 tokens long, so remove the 7 remaining tokens.
 | ||||
|       for (let i = 0; i < 7; i++) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt.eq)) { | ||||
|       this.tokens.replaceToken("module.exports"); | ||||
|       return true; | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches5(tt._export, tt._import, tt.name, tt.name, tt.eq) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 2, ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export import type T = require('T')
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       // This construct is always exactly 9 tokens long, so remove the 8 remaining tokens.
 | ||||
|       for (let i = 0; i < 8; i++) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._import)) { | ||||
|       return this.processImport(); | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt._default)) { | ||||
|       return this.processExportDefault(); | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt.braceL)) { | ||||
|       return this.processNamedExports(); | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches2(tt._export, tt.name) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export type {a};
 | ||||
|       // export type {a as b};
 | ||||
|       // export type {a} from './b';
 | ||||
|       // export type * from './b';
 | ||||
|       // export type * as ns from './b';
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       if (this.tokens.matches1(tt.braceL)) { | ||||
|         while (!this.tokens.matches1(tt.braceR)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         // *
 | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(tt._as)) { | ||||
|           // as
 | ||||
|           this.tokens.removeToken(); | ||||
|           // ns
 | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|       // Remove type re-export `... } from './T'`
 | ||||
|       if ( | ||||
|         this.tokens.matchesContextual(ContextualKeyword._from) && | ||||
|         this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, tt.string) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         removeMaybeImportAttributes(this.tokens); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|    processImportEquals() { | ||||
|     const importName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|     if (this.shouldAutomaticallyElideImportedName(importName)) { | ||||
|       // If this name is only used as a type, elide the whole import.
 | ||||
|       elideImportEquals(this.tokens); | ||||
|     } else if (this.injectCreateRequireForImportRequire) { | ||||
|       // We're using require in an environment (Node ESM) that doesn't provide
 | ||||
|       // it as a global, so generate a helper to import it.
 | ||||
|       // import -> const
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|       // Foo
 | ||||
|       this.tokens.copyToken(); | ||||
|       // =
 | ||||
|       this.tokens.copyToken(); | ||||
|       // require
 | ||||
|       this.tokens.replaceToken(this.helperManager.getHelperName("require")); | ||||
|     } else { | ||||
|       // Otherwise, just switch `import` to `const`.
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|    processImport() { | ||||
|     if (this.tokens.matches2(tt._import, tt.parenL)) { | ||||
|       // Dynamic imports don't need to be transformed.
 | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     const snapshot = this.tokens.snapshot(); | ||||
|     const allImportsRemoved = this.removeImportTypeBindings(); | ||||
|     if (allImportsRemoved) { | ||||
|       this.tokens.restoreToSnapshot(snapshot); | ||||
|       while (!this.tokens.matches1(tt.string)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       this.tokens.removeToken(); | ||||
|       removeMaybeImportAttributes(this.tokens); | ||||
|       if (this.tokens.matches1(tt.semi)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove type bindings from this import, leaving the rest of the import intact. | ||||
|    * | ||||
|    * Return true if this import was ONLY types, and thus is eligible for removal. This will bail out | ||||
|    * of the replacement operation, so we can return early here. | ||||
|    */ | ||||
|    removeImportTypeBindings() { | ||||
|     this.tokens.copyExpectedToken(tt._import); | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(ContextualKeyword._type) && | ||||
|       !this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, tt.comma) && | ||||
|       !this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, ContextualKeyword._from) | ||||
|     ) { | ||||
|       // This is an "import type" statement, so exit early.
 | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.string)) { | ||||
|       // This is a bare import, so we should proceed with the import.
 | ||||
|       this.tokens.copyToken(); | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     // Skip the "module" token in import reflection.
 | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(ContextualKeyword._module) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 2, ContextualKeyword._from) | ||||
|     ) { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
| 
 | ||||
|     let foundNonTypeImport = false; | ||||
|     let foundAnyNamedImport = false; | ||||
|     let needsComma = false; | ||||
| 
 | ||||
|     // Handle default import.
 | ||||
|     if (this.tokens.matches1(tt.name)) { | ||||
|       if (this.shouldAutomaticallyElideImportedName(this.tokens.identifierName())) { | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(tt.comma)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } else { | ||||
|         foundNonTypeImport = true; | ||||
|         this.tokens.copyToken(); | ||||
|         if (this.tokens.matches1(tt.comma)) { | ||||
|           // We're in a statement like:
 | ||||
|           // import A, * as B from './A';
 | ||||
|           // or
 | ||||
|           // import A, {foo} from './A';
 | ||||
|           // where the `A` is being kept. The comma should be removed if and only
 | ||||
|           // if the next part of the import statement is elided, but that's hard
 | ||||
|           // to determine at this point in the code. Instead, always remove it
 | ||||
|           // and set a flag to add it back if necessary.
 | ||||
|           needsComma = true; | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.star)) { | ||||
|       if (this.shouldAutomaticallyElideImportedName(this.tokens.identifierNameAtRelativeIndex(2))) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         if (needsComma) { | ||||
|           this.tokens.appendCode(","); | ||||
|         } | ||||
|         foundNonTypeImport = true; | ||||
|         this.tokens.copyExpectedToken(tt.star); | ||||
|         this.tokens.copyExpectedToken(tt.name); | ||||
|         this.tokens.copyExpectedToken(tt.name); | ||||
|       } | ||||
|     } else if (this.tokens.matches1(tt.braceL)) { | ||||
|       if (needsComma) { | ||||
|         this.tokens.appendCode(","); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       while (!this.tokens.matches1(tt.braceR)) { | ||||
|         foundAnyNamedImport = true; | ||||
|         const specifierInfo = getImportExportSpecifierInfo(this.tokens); | ||||
|         if ( | ||||
|           specifierInfo.isType || | ||||
|           this.shouldAutomaticallyElideImportedName(specifierInfo.rightName) | ||||
|         ) { | ||||
|           while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|             this.tokens.removeToken(); | ||||
|           } | ||||
|           if (this.tokens.matches1(tt.comma)) { | ||||
|             this.tokens.removeToken(); | ||||
|           } | ||||
|         } else { | ||||
|           foundNonTypeImport = true; | ||||
|           while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|             this.tokens.copyToken(); | ||||
|           } | ||||
|           if (this.tokens.matches1(tt.comma)) { | ||||
|             this.tokens.copyToken(); | ||||
|           } | ||||
|         } | ||||
|       } | ||||
|       this.tokens.copyExpectedToken(tt.braceR); | ||||
|     } | ||||
| 
 | ||||
|     if (this.keepUnusedImports) { | ||||
|       return false; | ||||
|     } | ||||
|     if (this.isTypeScriptTransformEnabled) { | ||||
|       return !foundNonTypeImport; | ||||
|     } else if (this.isFlowTransformEnabled) { | ||||
|       // In Flow, unlike TS, `import {} from 'foo';` preserves the import.
 | ||||
|       return foundAnyNamedImport && !foundNonTypeImport; | ||||
|     } else { | ||||
|       return false; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    shouldAutomaticallyElideImportedName(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       !this.nonTypeIdentifiers.has(name) | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|    processExportDefault() { | ||||
|     if ( | ||||
|       shouldElideDefaultExport( | ||||
|         this.isTypeScriptTransformEnabled, | ||||
|         this.keepUnusedImports, | ||||
|         this.tokens, | ||||
|         this.declarationInfo, | ||||
|       ) | ||||
|     ) { | ||||
|       // If the exported value is just an identifier and should be elided by TypeScript
 | ||||
|       // rules, then remove it entirely. It will always have the form `export default e`,
 | ||||
|       // where `e` is an identifier.
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     const alreadyHasName = | ||||
|       this.tokens.matches4(tt._export, tt._default, tt._function, tt.name) || | ||||
|       // export default async function
 | ||||
|       (this.tokens.matches5(tt._export, tt._default, tt.name, tt._function, tt.name) && | ||||
|         this.tokens.matchesContextualAtIndex( | ||||
|           this.tokens.currentIndex() + 2, | ||||
|           ContextualKeyword._async, | ||||
|         )) || | ||||
|       this.tokens.matches4(tt._export, tt._default, tt._class, tt.name) || | ||||
|       this.tokens.matches5(tt._export, tt._default, tt._abstract, tt._class, tt.name); | ||||
| 
 | ||||
|     if (!alreadyHasName && this.reactHotLoaderTransformer) { | ||||
|       // This is a plain "export default E" statement and we need to assign E to a variable.
 | ||||
|       // Change "export default E" to "let _default; export default _default = E"
 | ||||
|       const defaultVarName = this.nameManager.claimFreeName("_default"); | ||||
|       this.tokens.replaceToken(`let ${defaultVarName}; export`); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(` ${defaultVarName} =`); | ||||
|       this.reactHotLoaderTransformer.setExtractedDefaultExportName(defaultVarName); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a statement with one of these forms: | ||||
|    * export {a, type b}; | ||||
|    * export {c, type d} from 'foo'; | ||||
|    * | ||||
|    * In both cases, any explicit type exports should be removed. In the first | ||||
|    * case, we also need to handle implicit export elision for names declared as | ||||
|    * types. In the second case, we must NOT do implicit named export elision, | ||||
|    * but we must remove the runtime import if all exports are type exports. | ||||
|    */ | ||||
|    processNamedExports() { | ||||
|     if (!this.isTypeScriptTransformEnabled) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt._export); | ||||
|     this.tokens.copyExpectedToken(tt.braceL); | ||||
| 
 | ||||
|     const isReExport = isExportFrom(this.tokens); | ||||
|     let foundNonTypeExport = false; | ||||
|     while (!this.tokens.matches1(tt.braceR)) { | ||||
|       const specifierInfo = getImportExportSpecifierInfo(this.tokens); | ||||
|       if ( | ||||
|         specifierInfo.isType || | ||||
|         (!isReExport && this.shouldElideExportedName(specifierInfo.leftName)) | ||||
|       ) { | ||||
|         // Type export, so remove all tokens, including any comma.
 | ||||
|         while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         if (this.tokens.matches1(tt.comma)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } else { | ||||
|         // Non-type export, so copy all tokens, including any comma.
 | ||||
|         foundNonTypeExport = true; | ||||
|         while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|           this.tokens.copyToken(); | ||||
|         } | ||||
|         if (this.tokens.matches1(tt.comma)) { | ||||
|           this.tokens.copyToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt.braceR); | ||||
| 
 | ||||
|     if (!this.keepUnusedImports && isReExport && !foundNonTypeExport) { | ||||
|       // This is a type-only re-export, so skip evaluating the other module. Technically this
 | ||||
|       // leaves the statement as `export {}`, but that's ok since that's a no-op.
 | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       removeMaybeImportAttributes(this.tokens); | ||||
|     } | ||||
| 
 | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * ESM elides all imports with the rule that we only elide if we see that it's | ||||
|    * a type and never see it as a value. This is in contrast to CJS, which | ||||
|    * elides imports that are completely unknown. | ||||
|    */ | ||||
|    shouldElideExportedName(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       this.declarationInfo.typeDeclarations.has(name) && | ||||
|       !this.declarationInfo.valueDeclarations.has(name) | ||||
|     ); | ||||
|   } | ||||
| } | ||||
							
								
								
									
										182
									
								
								node_modules/sucrase/dist/esm/transformers/FlowTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,182 @@ | |||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| import Transformer from "./Transformer"; | ||||
| 
 | ||||
| export default class FlowTransformer extends Transformer { | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      isImportsTransformEnabled, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.isImportsTransformEnabled = isImportsTransformEnabled; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if ( | ||||
|       this.rootTransformer.processPossibleArrowParamEnd() || | ||||
|       this.rootTransformer.processPossibleAsyncArrowWithTypeParams() || | ||||
|       this.rootTransformer.processPossibleTypeRange() | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._enum)) { | ||||
|       this.processEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(tt._export, tt._enum)) { | ||||
|       this.processNamedExportEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches3(tt._export, tt._default, tt._enum)) { | ||||
|       this.processDefaultExportEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a declaration like: | ||||
|    * export enum E ... | ||||
|    * | ||||
|    * With this imports transform, this becomes: | ||||
|    * const E = [[enum]]; exports.E = E; | ||||
|    * | ||||
|    * otherwise, it becomes: | ||||
|    * export const E = [[enum]]; | ||||
|    */ | ||||
|   processNamedExportEnum() { | ||||
|     if (this.isImportsTransformEnabled) { | ||||
|       // export
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       const enumName = this.tokens.identifierNameAtRelativeIndex(1); | ||||
|       this.processEnum(); | ||||
|       this.tokens.appendCode(` exports.${enumName} = ${enumName};`); | ||||
|     } else { | ||||
|       this.tokens.copyToken(); | ||||
|       this.processEnum(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a declaration like: | ||||
|    * export default enum E | ||||
|    * | ||||
|    * With the imports transform, this becomes: | ||||
|    * const E = [[enum]]; exports.default = E; | ||||
|    * | ||||
|    * otherwise, it becomes: | ||||
|    * const E = [[enum]]; export default E; | ||||
|    */ | ||||
|   processDefaultExportEnum() { | ||||
|     // export
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     // default
 | ||||
|     this.tokens.removeToken(); | ||||
|     const enumName = this.tokens.identifierNameAtRelativeIndex(1); | ||||
|     this.processEnum(); | ||||
|     if (this.isImportsTransformEnabled) { | ||||
|       this.tokens.appendCode(` exports.default = ${enumName};`); | ||||
|     } else { | ||||
|       this.tokens.appendCode(` export default ${enumName};`); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transpile flow enums to invoke the "flow-enums-runtime" library. | ||||
|    * | ||||
|    * Currently, the transpiled code always uses `require("flow-enums-runtime")`, | ||||
|    * but if future flexibility is needed, we could expose a config option for | ||||
|    * this string (similar to configurable JSX). Even when targeting ESM, the | ||||
|    * default behavior of babel-plugin-transform-flow-enums is to use require | ||||
|    * rather than injecting an import. | ||||
|    * | ||||
|    * Flow enums are quite a bit simpler than TS enums and have some convenient | ||||
|    * constraints: | ||||
|    * - Element initializers must be either always present or always absent. That | ||||
|    *   means that we can use fixed lookahead on the first element (if any) and | ||||
|    *   assume that all elements are like that. | ||||
|    * - The right-hand side of an element initializer must be a literal value, | ||||
|    *   not a complex expression and not referencing other elements. That means | ||||
|    *   we can simply copy a single token. | ||||
|    * | ||||
|    * Enums can be broken up into three basic cases: | ||||
|    * | ||||
|    * Mirrored enums: | ||||
|    * enum E {A, B} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime").Mirrored(["A", "B"]); | ||||
|    * | ||||
|    * Initializer enums: | ||||
|    * enum E {A = 1, B = 2} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime")({A: 1, B: 2}); | ||||
|    * | ||||
|    * Symbol enums: | ||||
|    * enum E of symbol {A, B} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime")({A: Symbol("A"), B: Symbol("B")}); | ||||
|    * | ||||
|    * We can statically detect which of the three cases this is by looking at the | ||||
|    * "of" declaration (if any) and seeing if the first element has an initializer. | ||||
|    * Since the other transform details are so similar between the three cases, we | ||||
|    * use a single implementation and vary the transform within processEnumElement | ||||
|    * based on case. | ||||
|    */ | ||||
|   processEnum() { | ||||
|     // enum E -> const E
 | ||||
|     this.tokens.replaceToken("const"); | ||||
|     this.tokens.copyExpectedToken(tt.name); | ||||
| 
 | ||||
|     let isSymbolEnum = false; | ||||
|     if (this.tokens.matchesContextual(ContextualKeyword._of)) { | ||||
|       this.tokens.removeToken(); | ||||
|       isSymbolEnum = this.tokens.matchesContextual(ContextualKeyword._symbol); | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const hasInitializers = this.tokens.matches3(tt.braceL, tt.name, tt.eq); | ||||
|     this.tokens.appendCode(' = require("flow-enums-runtime")'); | ||||
| 
 | ||||
|     const isMirrored = !isSymbolEnum && !hasInitializers; | ||||
|     this.tokens.replaceTokenTrimmingLeftWhitespace(isMirrored ? ".Mirrored([" : "({"); | ||||
| 
 | ||||
|     while (!this.tokens.matches1(tt.braceR)) { | ||||
|       // ... is allowed at the end and has no runtime behavior.
 | ||||
|       if (this.tokens.matches1(tt.ellipsis)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
|       this.processEnumElement(isSymbolEnum, hasInitializers); | ||||
|       if (this.tokens.matches1(tt.comma)) { | ||||
|         this.tokens.copyToken(); | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     this.tokens.replaceToken(isMirrored ? "]);" : "});"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process an individual enum element, producing either an array element or an | ||||
|    * object element based on what type of enum this is. | ||||
|    */ | ||||
|   processEnumElement(isSymbolEnum, hasInitializers) { | ||||
|     if (isSymbolEnum) { | ||||
|       // Symbol enums never have initializers and are expanded to object elements.
 | ||||
|       // A, -> A: Symbol("A"),
 | ||||
|       const elementName = this.tokens.identifierName(); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(`: Symbol("${elementName}")`); | ||||
|     } else if (hasInitializers) { | ||||
|       // Initializers are expanded to object elements.
 | ||||
|       // A = 1, -> A: 1,
 | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(":"); | ||||
|       this.tokens.copyToken(); | ||||
|     } else { | ||||
|       // Enum elements without initializers become string literal array elements.
 | ||||
|       // A, -> "A",
 | ||||
|       this.tokens.replaceToken(`"${this.tokens.identifierName()}"`); | ||||
|     } | ||||
|   } | ||||
| } | ||||
							
								
								
									
										733
									
								
								node_modules/sucrase/dist/esm/transformers/JSXTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,733 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| import XHTMLEntities from "../parser/plugins/jsx/xhtml"; | ||||
| import {JSXRole} from "../parser/tokenizer"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| import {charCodes} from "../parser/util/charcodes"; | ||||
| 
 | ||||
| import getJSXPragmaInfo, {} from "../util/getJSXPragmaInfo"; | ||||
| 
 | ||||
| import Transformer from "./Transformer"; | ||||
| 
 | ||||
| export default class JSXTransformer extends Transformer { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   // State for calculating the line number of each JSX tag in development. | ||||
|   __init() {this.lastLineNumber = 1} | ||||
|   __init2() {this.lastIndex = 0} | ||||
| 
 | ||||
|   // In development, variable name holding the name of the current file. | ||||
|   __init3() {this.filenameVarName = null} | ||||
|   // Mapping of claimed names for imports in the automatic transform, e.g. | ||||
|   // {jsx: "_jsx"}. This determines which imports to generate in the prefix. | ||||
|   __init4() {this.esmAutomaticImportNameResolutions = {}} | ||||
|   // When automatically adding imports in CJS mode, we store the variable name | ||||
|   // holding the imported CJS module so we can require it in the prefix. | ||||
|   __init5() {this.cjsAutomaticModuleNameResolutions = {}} | ||||
| 
 | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      importProcessor, | ||||
|      nameManager, | ||||
|      options, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.importProcessor = importProcessor;this.nameManager = nameManager;this.options = options;JSXTransformer.prototype.__init.call(this);JSXTransformer.prototype.__init2.call(this);JSXTransformer.prototype.__init3.call(this);JSXTransformer.prototype.__init4.call(this);JSXTransformer.prototype.__init5.call(this);; | ||||
|     this.jsxPragmaInfo = getJSXPragmaInfo(options); | ||||
|     this.isAutomaticRuntime = options.jsxRuntime === "automatic"; | ||||
|     this.jsxImportSource = options.jsxImportSource || "react"; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if (this.tokens.matches1(tt.jsxTagStart)) { | ||||
|       this.processJSXTag(); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   getPrefixCode() { | ||||
|     let prefix = ""; | ||||
|     if (this.filenameVarName) { | ||||
|       prefix += `const ${this.filenameVarName} = ${JSON.stringify(this.options.filePath || "")};`; | ||||
|     } | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       if (this.importProcessor) { | ||||
|         // CJS mode: emit require statements for all modules that were referenced. | ||||
|         for (const [path, resolvedName] of Object.entries(this.cjsAutomaticModuleNameResolutions)) { | ||||
|           prefix += `var ${resolvedName} = require("${path}");`; | ||||
|         } | ||||
|       } else { | ||||
|         // ESM mode: consolidate and emit import statements for referenced names. | ||||
|         const {createElement: createElementResolution, ...otherResolutions} = | ||||
|           this.esmAutomaticImportNameResolutions; | ||||
|         if (createElementResolution) { | ||||
|           prefix += `import {createElement as ${createElementResolution}} from "${this.jsxImportSource}";`; | ||||
|         } | ||||
|         const importSpecifiers = Object.entries(otherResolutions) | ||||
|           .map(([name, resolvedName]) => `${name} as ${resolvedName}`) | ||||
|           .join(", "); | ||||
|         if (importSpecifiers) { | ||||
|           const importPath = | ||||
|             this.jsxImportSource + (this.options.production ? "/jsx-runtime" : "/jsx-dev-runtime"); | ||||
|           prefix += `import {${importSpecifiers}} from "${importPath}";`; | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|     return prefix; | ||||
|   } | ||||
| 
 | ||||
|   processJSXTag() { | ||||
|     const {jsxRole, start} = this.tokens.currentToken(); | ||||
|     // Calculate line number information at the very start (if in development | ||||
|     // mode) so that the information is guaranteed to be queried in token order. | ||||
|     const elementLocationCode = this.options.production ? null : this.getElementLocationCode(start); | ||||
|     if (this.isAutomaticRuntime && jsxRole !== JSXRole.KeyAfterPropSpread) { | ||||
|       this.transformTagToJSXFunc(elementLocationCode, jsxRole); | ||||
|     } else { | ||||
|       this.transformTagToCreateElement(elementLocationCode); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   getElementLocationCode(firstTokenStart) { | ||||
|     const lineNumber = this.getLineNumberForIndex(firstTokenStart); | ||||
|     return `lineNumber: ${lineNumber}`; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Get the line number for this source position. This is calculated lazily and | ||||
|    * must be called in increasing order by index. | ||||
|    */ | ||||
|   getLineNumberForIndex(index) { | ||||
|     const code = this.tokens.code; | ||||
|     while (this.lastIndex < index && this.lastIndex < code.length) { | ||||
|       if (code[this.lastIndex] === "\n") { | ||||
|         this.lastLineNumber++; | ||||
|       } | ||||
|       this.lastIndex++; | ||||
|     } | ||||
|     return this.lastLineNumber; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Convert the current JSX element to a call to jsx, jsxs, or jsxDEV. This is | ||||
|    * the primary transformation for the automatic transform. | ||||
|    * | ||||
|    * Example: | ||||
|    * <div a={1} key={2}>Hello{x}</div> | ||||
|    * becomes | ||||
|    * jsxs('div', {a: 1, children: ["Hello", x]}, 2) | ||||
|    */ | ||||
|   transformTagToJSXFunc(elementLocationCode, jsxRole) { | ||||
|     const isStatic = jsxRole === JSXRole.StaticChildren; | ||||
|     // First tag is always jsxTagStart. | ||||
|     this.tokens.replaceToken(this.getJSXFuncInvocationCode(isStatic)); | ||||
| 
 | ||||
|     let keyCode = null; | ||||
|     if (this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|       // Fragment syntax. | ||||
|       this.tokens.replaceToken(`${this.getFragmentCode()}, {`); | ||||
|       this.processAutomaticChildrenAndEndProps(jsxRole); | ||||
|     } else { | ||||
|       // Normal open tag or self-closing tag. | ||||
|       this.processTagIntro(); | ||||
|       this.tokens.appendCode(", {"); | ||||
|       keyCode = this.processProps(true); | ||||
| 
 | ||||
|       if (this.tokens.matches2(tt.slash, tt.jsxTagEnd)) { | ||||
|         // Self-closing tag, no children to add, so close the props. | ||||
|         this.tokens.appendCode("}"); | ||||
|       } else if (this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|         // Tag with children. | ||||
|         this.tokens.removeToken(); | ||||
|         this.processAutomaticChildrenAndEndProps(jsxRole); | ||||
|       } else { | ||||
|         throw new Error("Expected either /> or > at the end of the tag."); | ||||
|       } | ||||
|       // If a key was present, move it to its own arg. Note that moving code | ||||
|       // like this will cause line numbers to get out of sync within the JSX | ||||
|       // element if the key expression has a newline in it. This is unfortunate, | ||||
|       // but hopefully should be rare. | ||||
|       if (keyCode) { | ||||
|         this.tokens.appendCode(`, ${keyCode}`); | ||||
|       } | ||||
|     } | ||||
|     if (!this.options.production) { | ||||
|       // If the key wasn't already added, add it now so we can correctly set | ||||
|       // positional args for jsxDEV. | ||||
|       if (keyCode === null) { | ||||
|         this.tokens.appendCode(", void 0"); | ||||
|       } | ||||
|       this.tokens.appendCode(`, ${isStatic}, ${this.getDevSource(elementLocationCode)}, this`); | ||||
|     } | ||||
|     // We're at the close-tag or the end of a self-closing tag, so remove | ||||
|     // everything else and close the function call. | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     this.tokens.replaceToken(")"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Convert the current JSX element to a createElement call. In the classic | ||||
|    * runtime, this is the only case. In the automatic runtime, this is called | ||||
|    * as a fallback in some situations. | ||||
|    * | ||||
|    * Example: | ||||
|    * <div a={1} key={2}>Hello{x}</div> | ||||
|    * becomes | ||||
|    * React.createElement('div', {a: 1, key: 2}, "Hello", x) | ||||
|    */ | ||||
|   transformTagToCreateElement(elementLocationCode) { | ||||
|     // First tag is always jsxTagStart. | ||||
|     this.tokens.replaceToken(this.getCreateElementInvocationCode()); | ||||
| 
 | ||||
|     if (this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|       // Fragment syntax. | ||||
|       this.tokens.replaceToken(`${this.getFragmentCode()}, null`); | ||||
|       this.processChildren(true); | ||||
|     } else { | ||||
|       // Normal open tag or self-closing tag. | ||||
|       this.processTagIntro(); | ||||
|       this.processPropsObjectWithDevInfo(elementLocationCode); | ||||
| 
 | ||||
|       if (this.tokens.matches2(tt.slash, tt.jsxTagEnd)) { | ||||
|         // Self-closing tag; no children to process. | ||||
|       } else if (this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|         // Tag with children and a close-tag; process the children as args. | ||||
|         this.tokens.removeToken(); | ||||
|         this.processChildren(true); | ||||
|       } else { | ||||
|         throw new Error("Expected either /> or > at the end of the tag."); | ||||
|       } | ||||
|     } | ||||
|     // We're at the close-tag or the end of a self-closing tag, so remove | ||||
|     // everything else and close the function call. | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(tt.jsxTagEnd)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     this.tokens.replaceToken(")"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Get the code for the relevant function for this context: jsx, jsxs, | ||||
|    * or jsxDEV. The following open-paren is included as well. | ||||
|    * | ||||
|    * These functions are only used for the automatic runtime, so they are always | ||||
|    * auto-imported, but the auto-import will be either CJS or ESM based on the | ||||
|    * target module format. | ||||
|    */ | ||||
|   getJSXFuncInvocationCode(isStatic) { | ||||
|     if (this.options.production) { | ||||
|       if (isStatic) { | ||||
|         return this.claimAutoImportedFuncInvocation("jsxs", "/jsx-runtime"); | ||||
|       } else { | ||||
|         return this.claimAutoImportedFuncInvocation("jsx", "/jsx-runtime"); | ||||
|       } | ||||
|     } else { | ||||
|       return this.claimAutoImportedFuncInvocation("jsxDEV", "/jsx-dev-runtime"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return the code to use for the createElement function, e.g. | ||||
|    * `React.createElement`, including the following open-paren. | ||||
|    * | ||||
|    * This is the main function to use for the classic runtime. For the | ||||
|    * automatic runtime, this function is used as a fallback function to | ||||
|    * preserve behavior when there is a prop spread followed by an explicit | ||||
|    * key. In that automatic runtime case, the function should be automatically | ||||
|    * imported. | ||||
|    */ | ||||
|   getCreateElementInvocationCode() { | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       return this.claimAutoImportedFuncInvocation("createElement", ""); | ||||
|     } else { | ||||
|       const {jsxPragmaInfo} = this; | ||||
|       const resolvedPragmaBaseName = this.importProcessor | ||||
|         ? this.importProcessor.getIdentifierReplacement(jsxPragmaInfo.base) || jsxPragmaInfo.base | ||||
|         : jsxPragmaInfo.base; | ||||
|       return `${resolvedPragmaBaseName}${jsxPragmaInfo.suffix}(`; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return the code to use as the component when compiling a shorthand | ||||
|    * fragment, e.g. `React.Fragment`. | ||||
|    * | ||||
|    * This may be called from either the classic or automatic runtime, and | ||||
|    * the value should be auto-imported for the automatic runtime. | ||||
|    */ | ||||
|   getFragmentCode() { | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       return this.claimAutoImportedName( | ||||
|         "Fragment", | ||||
|         this.options.production ? "/jsx-runtime" : "/jsx-dev-runtime", | ||||
|       ); | ||||
|     } else { | ||||
|       const {jsxPragmaInfo} = this; | ||||
|       const resolvedFragmentPragmaBaseName = this.importProcessor | ||||
|         ? this.importProcessor.getIdentifierReplacement(jsxPragmaInfo.fragmentBase) || | ||||
|           jsxPragmaInfo.fragmentBase | ||||
|         : jsxPragmaInfo.fragmentBase; | ||||
|       return resolvedFragmentPragmaBaseName + jsxPragmaInfo.fragmentSuffix; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return code that invokes the given function. | ||||
|    * | ||||
|    * When the imports transform is enabled, use the CJSImportTransformer | ||||
|    * strategy of using `.call(void 0, ...` to avoid passing a `this` value in a | ||||
|    * situation that would otherwise look like a method call. | ||||
|    */ | ||||
|   claimAutoImportedFuncInvocation(funcName, importPathSuffix) { | ||||
|     const funcCode = this.claimAutoImportedName(funcName, importPathSuffix); | ||||
|     if (this.importProcessor) { | ||||
|       return `${funcCode}.call(void 0, `; | ||||
|     } else { | ||||
|       return `${funcCode}(`; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   claimAutoImportedName(funcName, importPathSuffix) { | ||||
|     if (this.importProcessor) { | ||||
|       // CJS mode: claim a name for the module and mark it for import. | ||||
|       const path = this.jsxImportSource + importPathSuffix; | ||||
|       if (!this.cjsAutomaticModuleNameResolutions[path]) { | ||||
|         this.cjsAutomaticModuleNameResolutions[path] = | ||||
|           this.importProcessor.getFreeIdentifierForPath(path); | ||||
|       } | ||||
|       return `${this.cjsAutomaticModuleNameResolutions[path]}.${funcName}`; | ||||
|     } else { | ||||
|       // ESM mode: claim a name for this function and add it to the names that | ||||
|       // should be auto-imported when the prefix is generated. | ||||
|       if (!this.esmAutomaticImportNameResolutions[funcName]) { | ||||
|         this.esmAutomaticImportNameResolutions[funcName] = this.nameManager.claimFreeName( | ||||
|           `_${funcName}`, | ||||
|         ); | ||||
|       } | ||||
|       return this.esmAutomaticImportNameResolutions[funcName]; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process the first part of a tag, before any props. | ||||
|    */ | ||||
|   processTagIntro() { | ||||
|     // Walk forward until we see one of these patterns: | ||||
|     // jsxName to start the first prop, preceded by another jsxName to end the tag name. | ||||
|     // jsxName to start the first prop, preceded by greaterThan to end the type argument. | ||||
|     // [open brace] to start the first prop. | ||||
|     // [jsxTagEnd] to end the open-tag. | ||||
|     // [slash, jsxTagEnd] to end the self-closing tag. | ||||
|     let introEnd = this.tokens.currentIndex() + 1; | ||||
|     while ( | ||||
|       this.tokens.tokens[introEnd].isType || | ||||
|       (!this.tokens.matches2AtIndex(introEnd - 1, tt.jsxName, tt.jsxName) && | ||||
|         !this.tokens.matches2AtIndex(introEnd - 1, tt.greaterThan, tt.jsxName) && | ||||
|         !this.tokens.matches1AtIndex(introEnd, tt.braceL) && | ||||
|         !this.tokens.matches1AtIndex(introEnd, tt.jsxTagEnd) && | ||||
|         !this.tokens.matches2AtIndex(introEnd, tt.slash, tt.jsxTagEnd)) | ||||
|     ) { | ||||
|       introEnd++; | ||||
|     } | ||||
|     if (introEnd === this.tokens.currentIndex() + 1) { | ||||
|       const tagName = this.tokens.identifierName(); | ||||
|       if (startsWithLowerCase(tagName)) { | ||||
|         this.tokens.replaceToken(`'${tagName}'`); | ||||
|       } | ||||
|     } | ||||
|     while (this.tokens.currentIndex() < introEnd) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Starting at the beginning of the props, add the props argument to | ||||
|    * React.createElement, including the comma before it. | ||||
|    */ | ||||
|   processPropsObjectWithDevInfo(elementLocationCode) { | ||||
|     const devProps = this.options.production | ||||
|       ? "" | ||||
|       : `__self: this, __source: ${this.getDevSource(elementLocationCode)}`; | ||||
|     if (!this.tokens.matches1(tt.jsxName) && !this.tokens.matches1(tt.braceL)) { | ||||
|       if (devProps) { | ||||
|         this.tokens.appendCode(`, {${devProps}}`); | ||||
|       } else { | ||||
|         this.tokens.appendCode(`, null`); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
|     this.tokens.appendCode(`, {`); | ||||
|     this.processProps(false); | ||||
|     if (devProps) { | ||||
|       this.tokens.appendCode(` ${devProps}}`); | ||||
|     } else { | ||||
|       this.tokens.appendCode("}"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform the core part of the props, assuming that a { has already been | ||||
|    * inserted before us and that a } will be inserted after us. | ||||
|    * | ||||
|    * If extractKeyCode is true (i.e. when using any jsx... function), any prop | ||||
|    * named "key" has its code captured and returned rather than being emitted to | ||||
|    * the output code. This shifts line numbers, and emitting the code later will | ||||
|    * correct line numbers again. If no key is found or if extractKeyCode is | ||||
|    * false, this function returns null. | ||||
|    */ | ||||
|   processProps(extractKeyCode) { | ||||
|     let keyCode = null; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches2(tt.jsxName, tt.eq)) { | ||||
|         // This is a regular key={value} or key="value" prop. | ||||
|         const propName = this.tokens.identifierName(); | ||||
|         if (extractKeyCode && propName === "key") { | ||||
|           if (keyCode !== null) { | ||||
|             // The props list has multiple keys. Different implementations are | ||||
|             // inconsistent about what to do here: as of this writing, Babel and | ||||
|             // swc keep the *last* key and completely remove the rest, while | ||||
|             // TypeScript uses the *first* key and leaves the others as regular | ||||
|             // props. The React team collaborated with Babel on the | ||||
|             // implementation of this behavior, so presumably the Babel behavior | ||||
|             // is the one to use. | ||||
|             // Since we won't ever be emitting the previous key code, we need to | ||||
|             // at least emit its newlines here so that the line numbers match up | ||||
|             // in the long run. | ||||
|             this.tokens.appendCode(keyCode.replace(/[^\n]/g, "")); | ||||
|           } | ||||
|           // key | ||||
|           this.tokens.removeToken(); | ||||
|           // = | ||||
|           this.tokens.removeToken(); | ||||
|           const snapshot = this.tokens.snapshot(); | ||||
|           this.processPropValue(); | ||||
|           keyCode = this.tokens.dangerouslyGetAndRemoveCodeSinceSnapshot(snapshot); | ||||
|           // Don't add a comma | ||||
|           continue; | ||||
|         } else { | ||||
|           this.processPropName(propName); | ||||
|           this.tokens.replaceToken(": "); | ||||
|           this.processPropValue(); | ||||
|         } | ||||
|       } else if (this.tokens.matches1(tt.jsxName)) { | ||||
|         // This is a shorthand prop like <input disabled />. | ||||
|         const propName = this.tokens.identifierName(); | ||||
|         this.processPropName(propName); | ||||
|         this.tokens.appendCode(": true"); | ||||
|       } else if (this.tokens.matches1(tt.braceL)) { | ||||
|         // This is prop spread, like <div {...getProps()}>, which we can pass | ||||
|         // through fairly directly as an object spread. | ||||
|         this.tokens.replaceToken(""); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.replaceToken(""); | ||||
|       } else { | ||||
|         break; | ||||
|       } | ||||
|       this.tokens.appendCode(","); | ||||
|     } | ||||
|     return keyCode; | ||||
|   } | ||||
| 
 | ||||
|   processPropName(propName) { | ||||
|     if (propName.includes("-")) { | ||||
|       this.tokens.replaceToken(`'${propName}'`); | ||||
|     } else { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processPropValue() { | ||||
|     if (this.tokens.matches1(tt.braceL)) { | ||||
|       this.tokens.replaceToken(""); | ||||
|       this.rootTransformer.processBalancedCode(); | ||||
|       this.tokens.replaceToken(""); | ||||
|     } else if (this.tokens.matches1(tt.jsxTagStart)) { | ||||
|       this.processJSXTag(); | ||||
|     } else { | ||||
|       this.processStringPropValue(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processStringPropValue() { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     const valueCode = this.tokens.code.slice(token.start + 1, token.end - 1); | ||||
|     const replacementCode = formatJSXTextReplacement(valueCode); | ||||
|     const literalCode = formatJSXStringValueLiteral(valueCode); | ||||
|     this.tokens.replaceToken(literalCode + replacementCode); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Starting in the middle of the props object literal, produce an additional | ||||
|    * prop for the children and close the object literal. | ||||
|    */ | ||||
|   processAutomaticChildrenAndEndProps(jsxRole) { | ||||
|     if (jsxRole === JSXRole.StaticChildren) { | ||||
|       this.tokens.appendCode(" children: ["); | ||||
|       this.processChildren(false); | ||||
|       this.tokens.appendCode("]}"); | ||||
|     } else { | ||||
|       // The parser information tells us whether we will see a real child or if | ||||
|       // all remaining children (if any) will resolve to empty. If there are no | ||||
|       // non-empty children, don't emit a children prop at all, but still | ||||
|       // process children so that we properly transform the code into nothing. | ||||
|       if (jsxRole === JSXRole.OneChild) { | ||||
|         this.tokens.appendCode(" children: "); | ||||
|       } | ||||
|       this.processChildren(false); | ||||
|       this.tokens.appendCode("}"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform children into a comma-separated list, which will be either | ||||
|    * arguments to createElement or array elements of a children prop. | ||||
|    */ | ||||
|   processChildren(needsInitialComma) { | ||||
|     let needsComma = needsInitialComma; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches2(tt.jsxTagStart, tt.slash)) { | ||||
|         // Closing tag, so no more children. | ||||
|         return; | ||||
|       } | ||||
|       let didEmitElement = false; | ||||
|       if (this.tokens.matches1(tt.braceL)) { | ||||
|         if (this.tokens.matches2(tt.braceL, tt.braceR)) { | ||||
|           // Empty interpolations and comment-only interpolations are allowed | ||||
|           // and don't create an extra child arg. | ||||
|           this.tokens.replaceToken(""); | ||||
|           this.tokens.replaceToken(""); | ||||
|         } else { | ||||
|           // Interpolated expression. | ||||
|           this.tokens.replaceToken(needsComma ? ", " : ""); | ||||
|           this.rootTransformer.processBalancedCode(); | ||||
|           this.tokens.replaceToken(""); | ||||
|           didEmitElement = true; | ||||
|         } | ||||
|       } else if (this.tokens.matches1(tt.jsxTagStart)) { | ||||
|         // Child JSX element | ||||
|         this.tokens.appendCode(needsComma ? ", " : ""); | ||||
|         this.processJSXTag(); | ||||
|         didEmitElement = true; | ||||
|       } else if (this.tokens.matches1(tt.jsxText) || this.tokens.matches1(tt.jsxEmptyText)) { | ||||
|         didEmitElement = this.processChildTextElement(needsComma); | ||||
|       } else { | ||||
|         throw new Error("Unexpected token when processing JSX children."); | ||||
|       } | ||||
|       if (didEmitElement) { | ||||
|         needsComma = true; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Turn a JSX text element into a string literal, or nothing at all if the JSX | ||||
|    * text resolves to the empty string. | ||||
|    * | ||||
|    * Returns true if a string literal is emitted, false otherwise. | ||||
|    */ | ||||
|   processChildTextElement(needsComma) { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     const valueCode = this.tokens.code.slice(token.start, token.end); | ||||
|     const replacementCode = formatJSXTextReplacement(valueCode); | ||||
|     const literalCode = formatJSXTextLiteral(valueCode); | ||||
|     if (literalCode === '""') { | ||||
|       this.tokens.replaceToken(replacementCode); | ||||
|       return false; | ||||
|     } else { | ||||
|       this.tokens.replaceToken(`${needsComma ? ", " : ""}${literalCode}${replacementCode}`); | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   getDevSource(elementLocationCode) { | ||||
|     return `{fileName: ${this.getFilenameVarName()}, ${elementLocationCode}}`; | ||||
|   } | ||||
| 
 | ||||
|   getFilenameVarName() { | ||||
|     if (!this.filenameVarName) { | ||||
|       this.filenameVarName = this.nameManager.claimFreeName("_jsxFileName"); | ||||
|     } | ||||
|     return this.filenameVarName; | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Spec for identifiers: https://tc39.github.io/ecma262/#prod-IdentifierStart. | ||||
|  * | ||||
|  * Really only treat anything starting with a-z as tag names.  `_`, `$`, `é` | ||||
|  * should be treated as component names | ||||
|  */ | ||||
| export function startsWithLowerCase(s) { | ||||
|   const firstChar = s.charCodeAt(0); | ||||
|   return firstChar >= charCodes.lowercaseA && firstChar <= charCodes.lowercaseZ; | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Turn the given jsxText string into a JS string literal. Leading and trailing | ||||
|  * whitespace on lines is removed, except immediately after the open-tag and | ||||
|  * before the close-tag. Empty lines are completely removed, and spaces are | ||||
|  * added between lines after that. | ||||
|  * | ||||
|  * We use JSON.stringify to introduce escape characters as necessary, and trim | ||||
|  * the start and end of each line and remove blank lines. | ||||
|  */ | ||||
| function formatJSXTextLiteral(text) { | ||||
|   let result = ""; | ||||
|   let whitespace = ""; | ||||
| 
 | ||||
|   let isInInitialLineWhitespace = false; | ||||
|   let seenNonWhitespace = false; | ||||
|   for (let i = 0; i < text.length; i++) { | ||||
|     const c = text[i]; | ||||
|     if (c === " " || c === "\t" || c === "\r") { | ||||
|       if (!isInInitialLineWhitespace) { | ||||
|         whitespace += c; | ||||
|       } | ||||
|     } else if (c === "\n") { | ||||
|       whitespace = ""; | ||||
|       isInInitialLineWhitespace = true; | ||||
|     } else { | ||||
|       if (seenNonWhitespace && isInInitialLineWhitespace) { | ||||
|         result += " "; | ||||
|       } | ||||
|       result += whitespace; | ||||
|       whitespace = ""; | ||||
|       if (c === "&") { | ||||
|         const {entity, newI} = processEntity(text, i + 1); | ||||
|         i = newI - 1; | ||||
|         result += entity; | ||||
|       } else { | ||||
|         result += c; | ||||
|       } | ||||
|       seenNonWhitespace = true; | ||||
|       isInInitialLineWhitespace = false; | ||||
|     } | ||||
|   } | ||||
|   if (!isInInitialLineWhitespace) { | ||||
|     result += whitespace; | ||||
|   } | ||||
|   return JSON.stringify(result); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Produce the code that should be printed after the JSX text string literal, | ||||
|  * with most content removed, but all newlines preserved and all spacing at the | ||||
|  * end preserved. | ||||
|  */ | ||||
| function formatJSXTextReplacement(text) { | ||||
|   let numNewlines = 0; | ||||
|   let numSpaces = 0; | ||||
|   for (const c of text) { | ||||
|     if (c === "\n") { | ||||
|       numNewlines++; | ||||
|       numSpaces = 0; | ||||
|     } else if (c === " ") { | ||||
|       numSpaces++; | ||||
|     } | ||||
|   } | ||||
|   return "\n".repeat(numNewlines) + " ".repeat(numSpaces); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Format a string in the value position of a JSX prop. | ||||
|  * | ||||
|  * Use the same implementation as convertAttribute from | ||||
|  * babel-helper-builder-react-jsx. | ||||
|  */ | ||||
| function formatJSXStringValueLiteral(text) { | ||||
|   let result = ""; | ||||
|   for (let i = 0; i < text.length; i++) { | ||||
|     const c = text[i]; | ||||
|     if (c === "\n") { | ||||
|       if (/\s/.test(text[i + 1])) { | ||||
|         result += " "; | ||||
|         while (i < text.length && /\s/.test(text[i + 1])) { | ||||
|           i++; | ||||
|         } | ||||
|       } else { | ||||
|         result += "\n"; | ||||
|       } | ||||
|     } else if (c === "&") { | ||||
|       const {entity, newI} = processEntity(text, i + 1); | ||||
|       result += entity; | ||||
|       i = newI - 1; | ||||
|     } else { | ||||
|       result += c; | ||||
|     } | ||||
|   } | ||||
|   return JSON.stringify(result); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Starting at a &, see if there's an HTML entity (specified by name, decimal | ||||
|  * char code, or hex char code) and return it if so. | ||||
|  * | ||||
|  * Modified from jsxReadString in babel-parser. | ||||
|  */ | ||||
| function processEntity(text, indexAfterAmpersand) { | ||||
|   let str = ""; | ||||
|   let count = 0; | ||||
|   let entity; | ||||
|   let i = indexAfterAmpersand; | ||||
| 
 | ||||
|   if (text[i] === "#") { | ||||
|     let radix = 10; | ||||
|     i++; | ||||
|     let numStart; | ||||
|     if (text[i] === "x") { | ||||
|       radix = 16; | ||||
|       i++; | ||||
|       numStart = i; | ||||
|       while (i < text.length && isHexDigit(text.charCodeAt(i))) { | ||||
|         i++; | ||||
|       } | ||||
|     } else { | ||||
|       numStart = i; | ||||
|       while (i < text.length && isDecimalDigit(text.charCodeAt(i))) { | ||||
|         i++; | ||||
|       } | ||||
|     } | ||||
|     if (text[i] === ";") { | ||||
|       const numStr = text.slice(numStart, i); | ||||
|       if (numStr) { | ||||
|         i++; | ||||
|         entity = String.fromCodePoint(parseInt(numStr, radix)); | ||||
|       } | ||||
|     } | ||||
|   } else { | ||||
|     while (i < text.length && count++ < 10) { | ||||
|       const ch = text[i]; | ||||
|       i++; | ||||
|       if (ch === ";") { | ||||
|         entity = XHTMLEntities.get(str); | ||||
|         break; | ||||
|       } | ||||
|       str += ch; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   if (!entity) { | ||||
|     return {entity: "&", newI: indexAfterAmpersand}; | ||||
|   } | ||||
|   return {entity, newI: i}; | ||||
| } | ||||
| 
 | ||||
| function isDecimalDigit(code) { | ||||
|   return code >= charCodes.digit0 && code <= charCodes.digit9; | ||||
| } | ||||
| 
 | ||||
| function isHexDigit(code) { | ||||
|   return ( | ||||
|     (code >= charCodes.digit0 && code <= charCodes.digit9) || | ||||
|     (code >= charCodes.lowercaseA && code <= charCodes.lowercaseF) || | ||||
|     (code >= charCodes.uppercaseA && code <= charCodes.uppercaseF) | ||||
|   ); | ||||
| } | ||||
							
								
								
									
111 node_modules/sucrase/dist/esm/transformers/JestHoistTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,111 @@ | |||
|  function _optionalChain(ops) { let lastAccessLHS = undefined; let value = ops[0]; let i = 1; while (i < ops.length) { const op = ops[i]; const fn = ops[i + 1]; i += 2; if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { return undefined; } if (op === 'access' || op === 'optionalAccess') { lastAccessLHS = value; value = fn(value); } else if (op === 'call' || op === 'optionalCall') { value = fn((...args) => value.call(lastAccessLHS, ...args)); lastAccessLHS = undefined; } } return value; } | ||||
|  | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| const JEST_GLOBAL_NAME = "jest"; | ||||
| const HOISTED_METHODS = ["mock", "unmock", "enableAutomock", "disableAutomock"]; | ||||
|  | ||||
| /** | ||||
|  * Implementation of babel-plugin-jest-hoist, which hoists up some jest method | ||||
|  * calls above the imports to allow them to override other imports. | ||||
|  * | ||||
|  * To preserve line numbers, rather than directly moving the jest.mock code, we | ||||
|  * wrap each invocation in a function statement and then call the function from | ||||
|  * the top of the file. | ||||
|  */ | ||||
| export default class JestHoistTransformer extends Transformer { | ||||
|     __init() {this.hoistedFunctionNames = []} | ||||
|  | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      nameManager, | ||||
|      importProcessor, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.nameManager = nameManager;this.importProcessor = importProcessor;JestHoistTransformer.prototype.__init.call(this);; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     if ( | ||||
|       this.tokens.currentToken().scopeDepth === 0 && | ||||
|       this.tokens.matches4(tt.name, tt.dot, tt.name, tt.parenL) && | ||||
|       this.tokens.identifierName() === JEST_GLOBAL_NAME | ||||
|     ) { | ||||
|       // TODO: This only works if imports transform is active, which it will be for jest. | ||||
|       //       But if jest adds module support and we no longer need the import transform, this needs fixing. | ||||
|       if (_optionalChain([this, 'access', _ => _.importProcessor, 'optionalAccess', _2 => _2.getGlobalNames, 'call', _3 => _3(), 'optionalAccess', _4 => _4.has, 'call', _5 => _5(JEST_GLOBAL_NAME)])) { | ||||
|         return false; | ||||
|       } | ||||
|       return this.extractHoistedCalls(); | ||||
|     } | ||||
|  | ||||
|     return false; | ||||
|   } | ||||
|  | ||||
|   getHoistedCode() { | ||||
|     if (this.hoistedFunctionNames.length > 0) { | ||||
|       // This will be placed before module interop code, but that's fine since | ||||
|       // imports aren't allowed in module mock factories. | ||||
|       return this.hoistedFunctionNames.map((name) => `${name}();`).join(""); | ||||
|     } | ||||
|     return ""; | ||||
|   } | ||||
|  | ||||
|   /** | ||||
|    * Extracts any method calls on the jest object that should be hoisted. | ||||
|    * | ||||
|    * According to the jest docs, https://jestjs.io/docs/en/jest-object#jestmockmodulename-factory-options, | ||||
|    * mock, unmock, enableAutomock, disableAutomock, are the methods that should be hoisted. | ||||
|    * | ||||
|    * We do not apply the same checks of the arguments as babel-plugin-jest-hoist does. | ||||
|    */ | ||||
|    extractHoistedCalls() { | ||||
|     // We're handling a chain of calls where `jest` may or may not need to be inserted for each call | ||||
|     // in the chain, so remove the initial `jest` to make the loop implementation cleaner. | ||||
|     this.tokens.removeToken(); | ||||
|     // Track some state so that multiple non-hoisted chained calls in a row keep their chaining | ||||
|     // syntax. | ||||
|     let followsNonHoistedJestCall = false; | ||||
|  | ||||
|     // Iterate through all chained calls on the jest object. | ||||
|     while (this.tokens.matches3(tt.dot, tt.name, tt.parenL)) { | ||||
|       const methodName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|       const shouldHoist = HOISTED_METHODS.includes(methodName); | ||||
|       if (shouldHoist) { | ||||
|         // We've matched e.g. `.mock(...)` or similar call. | ||||
|         // Replace the initial `.` with `function __jestHoist(){jest.` | ||||
|         const hoistedFunctionName = this.nameManager.claimFreeName("__jestHoist"); | ||||
|         this.hoistedFunctionNames.push(hoistedFunctionName); | ||||
|         this.tokens.replaceToken(`function ${hoistedFunctionName}(){${JEST_GLOBAL_NAME}.`); | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(tt.parenR); | ||||
|         this.tokens.appendCode(";}"); | ||||
|         followsNonHoistedJestCall = false; | ||||
|       } else { | ||||
|         // This is a non-hoisted method, so just transform the code as usual. | ||||
|         if (followsNonHoistedJestCall) { | ||||
|           // If we didn't hoist the previous call, we can leave the code as-is to chain off of the | ||||
|           // previous method call. It's important to preserve the code here because we don't know | ||||
|           // for sure that the method actually returned the jest object for chaining. | ||||
|           this.tokens.copyToken(); | ||||
|         } else { | ||||
|           // If we hoisted the previous call, we know it returns the jest object back, so we insert | ||||
|           // the identifier `jest` to continue the chain. | ||||
|           this.tokens.replaceToken(`${JEST_GLOBAL_NAME}.`); | ||||
|         } | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(tt.parenR); | ||||
|         followsNonHoistedJestCall = true; | ||||
|       } | ||||
|     } | ||||
|  | ||||
|     return true; | ||||
|   } | ||||
| } | ||||
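The wrap-and-invoke shape described in the class comment above can be seen in isolation. This is a hedged sketch: the `__jestHoist` names are illustrative (the real ones come from `NameManager.claimFreeName`), and only the `getHoistedCode` join is taken from the transformer itself.

```javascript
// A call like `jest.mock('./api')` is rewritten in place into
// `function __jestHoist(){jest.mock('./api');}` (preserving line numbers),
// and the wrapper names are collected for invocation at the top of the file.
const hoistedFunctionNames = ["__jestHoist", "__jestHoist2"]; // illustrative names

// Mirrors getHoistedCode above: invoke every wrapper before the module body runs.
const hoistedCode = hoistedFunctionNames.map((name) => `${name}();`).join("");

console.log(hoistedCode); // "__jestHoist();__jestHoist2();"
```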
							
								
								
									
20 node_modules/sucrase/dist/esm/transformers/NumericSeparatorTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,20 @@ | |||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| export default class NumericSeparatorTransformer extends Transformer { | ||||
|   constructor( tokens) { | ||||
|     super();this.tokens = tokens;; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     if (this.tokens.matches1(tt.num)) { | ||||
|       const code = this.tokens.currentTokenCode(); | ||||
|       if (code.includes("_")) { | ||||
|         this.tokens.replaceToken(code.replace(/_/g, "")); | ||||
|         return true; | ||||
|       } | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| } | ||||
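The entire rewrite this transformer performs is the `replace` call above, applied to the raw text of a numeric token containing an underscore. A standalone sketch of that one step:

```javascript
// Strip numeric separators the same way the transformer does for matched tokens.
const stripSeparators = (raw) => raw.replace(/_/g, "");

console.log(stripSeparators("1_000_000")); // "1000000"
console.log(stripSeparators("0xFF_FF"));   // "0xFFFF"
```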
							
								
								
									
19 node_modules/sucrase/dist/esm/transformers/OptionalCatchBindingTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,19 @@ | |||
|  | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| export default class OptionalCatchBindingTransformer extends Transformer { | ||||
|   constructor( tokens,  nameManager) { | ||||
|     super();this.tokens = tokens;this.nameManager = nameManager;; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     if (this.tokens.matches2(tt._catch, tt.braceL)) { | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(` (${this.nameManager.claimFreeName("e")})`); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| } | ||||
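The rewrite amounts to inserting a fresh, unused binding after a bare `catch`. The regex-based sketch below is a simplification: the real transformer works on tokens and asks `NameManager` for a collision-free name, so `_e` here is a hypothetical choice.

```javascript
// Naive string-level version of the optional-catch-binding downlevel.
function downlevelOptionalCatch(src) {
  return src.replace(/catch\s*\{/g, "catch (_e) {"); // `_e` is illustrative
}

const out = downlevelOptionalCatch("try { f(); } catch { g(); }");
console.log(out); // "try { f(); } catch (_e) { g(); }"
```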
							
								
								
									
155 node_modules/sucrase/dist/esm/transformers/OptionalChainingNullishTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,155 @@ | |||
|  | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| /** | ||||
|  * Transformer supporting the optional chaining and nullish coalescing operators. | ||||
|  * | ||||
|  * Tech plan here: | ||||
|  * https://github.com/alangpierce/sucrase/wiki/Sucrase-Optional-Chaining-and-Nullish-Coalescing-Technical-Plan | ||||
|  * | ||||
|  * The prefix and suffix code snippets are handled by TokenProcessor, and this transformer handles | ||||
|  * the operators themselves. | ||||
|  */ | ||||
| export default class OptionalChainingNullishTransformer extends Transformer { | ||||
|   constructor( tokens,  nameManager) { | ||||
|     super();this.tokens = tokens;this.nameManager = nameManager;; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     if (this.tokens.matches1(tt.nullishCoalescing)) { | ||||
|       const token = this.tokens.currentToken(); | ||||
|       if (this.tokens.tokens[token.nullishStartIndex].isAsyncOperation) { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(", async () => ("); | ||||
|       } else { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(", () => ("); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._delete)) { | ||||
|       const nextToken = this.tokens.tokenAtRelativeIndex(1); | ||||
|       if (nextToken.isOptionalChainStart) { | ||||
|         this.tokens.removeInitialToken(); | ||||
|         return true; | ||||
|       } | ||||
|     } | ||||
|     const token = this.tokens.currentToken(); | ||||
|     const chainStart = token.subscriptStartIndex; | ||||
|     if ( | ||||
|       chainStart != null && | ||||
|       this.tokens.tokens[chainStart].isOptionalChainStart && | ||||
|       // Super subscripts can't be optional (since super is never null/undefined), and the syntax | ||||
|       // relies on the subscript being intact, so leave this token alone. | ||||
|       this.tokens.tokenAtRelativeIndex(-1).type !== tt._super | ||||
|     ) { | ||||
|       const param = this.nameManager.claimFreeName("_"); | ||||
|       let arrowStartSnippet; | ||||
|       if ( | ||||
|         chainStart > 0 && | ||||
|         this.tokens.matches1AtIndex(chainStart - 1, tt._delete) && | ||||
|         this.isLastSubscriptInChain() | ||||
|       ) { | ||||
|         // Delete operations are special: we already removed the delete keyword, and to still | ||||
|         // perform a delete, we need to insert a delete in the very last part of the chain, which | ||||
|         // in correct code will always be a property access. | ||||
|         arrowStartSnippet = `${param} => delete ${param}`; | ||||
|       } else { | ||||
|         arrowStartSnippet = `${param} => ${param}`; | ||||
|       } | ||||
|       if (this.tokens.tokens[chainStart].isAsyncOperation) { | ||||
|         arrowStartSnippet = `async ${arrowStartSnippet}`; | ||||
|       } | ||||
|       if ( | ||||
|         this.tokens.matches2(tt.questionDot, tt.parenL) || | ||||
|         this.tokens.matches2(tt.questionDot, tt.lessThan) | ||||
|       ) { | ||||
|         if (this.justSkippedSuper()) { | ||||
|           this.tokens.appendCode(".bind(this)"); | ||||
|         } | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'optionalCall', ${arrowStartSnippet}`); | ||||
|       } else if (this.tokens.matches2(tt.questionDot, tt.bracketL)) { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'optionalAccess', ${arrowStartSnippet}`); | ||||
|       } else if (this.tokens.matches1(tt.questionDot)) { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'optionalAccess', ${arrowStartSnippet}.`); | ||||
|       } else if (this.tokens.matches1(tt.dot)) { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'access', ${arrowStartSnippet}.`); | ||||
|       } else if (this.tokens.matches1(tt.bracketL)) { | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'access', ${arrowStartSnippet}[`); | ||||
|       } else if (this.tokens.matches1(tt.parenL)) { | ||||
|         if (this.justSkippedSuper()) { | ||||
|           this.tokens.appendCode(".bind(this)"); | ||||
|         } | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(`, 'call', ${arrowStartSnippet}(`); | ||||
|       } else { | ||||
|         throw new Error("Unexpected subscript operator in optional chain."); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
|  | ||||
|   /** | ||||
|    * Determine if the current token is the last of its chain, so that we know whether it's eligible | ||||
|    * to have a delete op inserted. | ||||
|    * | ||||
|    * We can do this by walking forward until we determine one way or another. Each | ||||
|    * isOptionalChainStart token must be paired with exactly one isOptionalChainEnd token after it in | ||||
|    * a nesting way, so we can track depth and walk to the end of the chain (the point where the | ||||
|    * depth goes negative) and see if any other subscript token is after us in the chain. | ||||
|    */ | ||||
|   isLastSubscriptInChain() { | ||||
|     let depth = 0; | ||||
|     for (let i = this.tokens.currentIndex() + 1; ; i++) { | ||||
|       if (i >= this.tokens.tokens.length) { | ||||
|         throw new Error("Reached the end of the code while finding the end of the access chain."); | ||||
|       } | ||||
|       if (this.tokens.tokens[i].isOptionalChainStart) { | ||||
|         depth++; | ||||
|       } else if (this.tokens.tokens[i].isOptionalChainEnd) { | ||||
|         depth--; | ||||
|       } | ||||
|       if (depth < 0) { | ||||
|         return true; | ||||
|       } | ||||
|  | ||||
|       // This subscript token is a later one in the same chain. | ||||
|       if (depth === 0 && this.tokens.tokens[i].subscriptStartIndex != null) { | ||||
|         return false; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|  | ||||
|   /** | ||||
|    * Determine if we are the open-paren in an expression like super.a()?.b. | ||||
|    * | ||||
|    * We can do this by walking backward to find the previous subscript. If that subscript was | ||||
|    * preceded by a super, then we must be the subscript after it, so if this is a call expression, | ||||
|    * we'll need to attach the right context. | ||||
|    */ | ||||
|   justSkippedSuper() { | ||||
|     let depth = 0; | ||||
|     let index = this.tokens.currentIndex() - 1; | ||||
|     while (true) { | ||||
|       if (index < 0) { | ||||
|         throw new Error( | ||||
|           "Reached the start of the code while finding the start of the access chain.", | ||||
|         ); | ||||
|       } | ||||
|       if (this.tokens.tokens[index].isOptionalChainStart) { | ||||
|         depth--; | ||||
|       } else if (this.tokens.tokens[index].isOptionalChainEnd) { | ||||
|         depth++; | ||||
|       } | ||||
|       if (depth < 0) { | ||||
|         return false; | ||||
|       } | ||||
|  | ||||
|       // This subscript token is a later one in the same chain. | ||||
|       if (depth === 0 && this.tokens.tokens[index].subscriptStartIndex != null) { | ||||
|         return this.tokens.tokens[index - 1].type === tt._super; | ||||
|       } | ||||
|       index--; | ||||
|     } | ||||
|   } | ||||
| } | ||||
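The `'access'`/`'optionalAccess'`/`'call'`/`'optionalCall'` snippets emitted above feed a runtime helper. `_optionalChain` below is copied verbatim from the generated header of JestHoistTransformer.js earlier in this commit, and the op list shown is roughly what the transformer produces for `user?.profile?.getName?.()` (the `user` object is illustrative).

```javascript
// Runtime helper, copied from the generated code in this same commit.
function _optionalChain(ops) { let lastAccessLHS = undefined; let value = ops[0]; let i = 1; while (i < ops.length) { const op = ops[i]; const fn = ops[i + 1]; i += 2; if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { return undefined; } if (op === 'access' || op === 'optionalAccess') { lastAccessLHS = value; value = fn(value); } else if (op === 'call' || op === 'optionalCall') { value = fn((...args) => value.call(lastAccessLHS, ...args)); lastAccessLHS = undefined; } } return value; }

const user = {profile: {getName() { return "Ada"; }}};

// Approximate compilation of `user?.profile?.getName?.()`:
const name = _optionalChain([user, 'optionalAccess', _ => _.profile, 'optionalAccess', _2 => _2.getName, 'optionalCall', _3 => _3()]);
console.log(name); // "Ada"

// A nullish start short-circuits the whole chain to undefined:
const missing = _optionalChain([null, 'optionalAccess', _ => _.profile]);
console.log(missing); // undefined
```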
							
								
								
									
160 node_modules/sucrase/dist/esm/transformers/ReactDisplayNameTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,160 @@ | |||
|  | ||||
|  | ||||
| import {IdentifierRole} from "../parser/tokenizer"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| /** | ||||
|  * Implementation of babel-plugin-transform-react-display-name, which adds a | ||||
|  * display name to usages of React.createClass and createReactClass. | ||||
|  */ | ||||
| export default class ReactDisplayNameTransformer extends Transformer { | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      importProcessor, | ||||
|      options, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.importProcessor = importProcessor;this.options = options;; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     const startIndex = this.tokens.currentIndex(); | ||||
|     if (this.tokens.identifierName() === "createReactClass") { | ||||
|       const newName = | ||||
|         this.importProcessor && this.importProcessor.getIdentifierReplacement("createReactClass"); | ||||
|       if (newName) { | ||||
|         this.tokens.replaceToken(`(0, ${newName})`); | ||||
|       } else { | ||||
|         this.tokens.copyToken(); | ||||
|       } | ||||
|       this.tryProcessCreateClassCall(startIndex); | ||||
|       return true; | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches3(tt.name, tt.dot, tt.name) && | ||||
|       this.tokens.identifierName() === "React" && | ||||
|       this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 2) === "createClass" | ||||
|     ) { | ||||
|       const newName = this.importProcessor | ||||
|         ? this.importProcessor.getIdentifierReplacement("React") || "React" | ||||
|         : "React"; | ||||
|       if (newName) { | ||||
|         this.tokens.replaceToken(newName); | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|       } else { | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|       } | ||||
|       this.tryProcessCreateClassCall(startIndex); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
|  | ||||
|   /** | ||||
|    * This is called with the token position at the open-paren. | ||||
|    */ | ||||
|    tryProcessCreateClassCall(startIndex) { | ||||
|     const displayName = this.findDisplayName(startIndex); | ||||
|     if (!displayName) { | ||||
|       return; | ||||
|     } | ||||
|  | ||||
|     if (this.classNeedsDisplayName()) { | ||||
|       this.tokens.copyExpectedToken(tt.parenL); | ||||
|       this.tokens.copyExpectedToken(tt.braceL); | ||||
|       this.tokens.appendCode(`displayName: '${displayName}',`); | ||||
|       this.rootTransformer.processBalancedCode(); | ||||
|       this.tokens.copyExpectedToken(tt.braceR); | ||||
|       this.tokens.copyExpectedToken(tt.parenR); | ||||
|     } | ||||
|   } | ||||
|  | ||||
|    findDisplayName(startIndex) { | ||||
|     if (startIndex < 2) { | ||||
|       return null; | ||||
|     } | ||||
|     if (this.tokens.matches2AtIndex(startIndex - 2, tt.name, tt.eq)) { | ||||
|       // This is an assignment (or declaration) and the LHS is either an identifier or a member | ||||
|       // expression ending in an identifier, so use that identifier name. | ||||
|       return this.tokens.identifierNameAtIndex(startIndex - 2); | ||||
|     } | ||||
|     if ( | ||||
|       startIndex >= 2 && | ||||
|       this.tokens.tokens[startIndex - 2].identifierRole === IdentifierRole.ObjectKey | ||||
|     ) { | ||||
|       // This is an object literal value. | ||||
|       return this.tokens.identifierNameAtIndex(startIndex - 2); | ||||
|     } | ||||
|     if (this.tokens.matches2AtIndex(startIndex - 2, tt._export, tt._default)) { | ||||
|       return this.getDisplayNameFromFilename(); | ||||
|     } | ||||
|     return null; | ||||
|   } | ||||
|  | ||||
|    getDisplayNameFromFilename() { | ||||
|     const filePath = this.options.filePath || "unknown"; | ||||
|     const pathSegments = filePath.split("/"); | ||||
|     const filename = pathSegments[pathSegments.length - 1]; | ||||
|     const dotIndex = filename.lastIndexOf("."); | ||||
|     const baseFilename = dotIndex === -1 ? filename : filename.slice(0, dotIndex); | ||||
|     if (baseFilename === "index" && pathSegments[pathSegments.length - 2]) { | ||||
|       return pathSegments[pathSegments.length - 2]; | ||||
|     } else { | ||||
|       return baseFilename; | ||||
|     } | ||||
|   } | ||||
|  | ||||
|   /** | ||||
|    * We only want to add a display name when this is a function call containing | ||||
|    * one argument, which is an object literal without `displayName` as an | ||||
|    * existing key. | ||||
|    */ | ||||
|    classNeedsDisplayName() { | ||||
|     let index = this.tokens.currentIndex(); | ||||
|     if (!this.tokens.matches2(tt.parenL, tt.braceL)) { | ||||
|       return false; | ||||
|     } | ||||
|     // The block starts on the {, and we expect any displayName key to be in | ||||
|     // that context. We need to ignore other contexts to avoid matching | ||||
|     // nested displayName keys. | ||||
|     const objectStartIndex = index + 1; | ||||
|     const objectContextId = this.tokens.tokens[objectStartIndex].contextId; | ||||
|     if (objectContextId == null) { | ||||
|       throw new Error("Expected non-null context ID on object open-brace."); | ||||
|     } | ||||
|  | ||||
|     for (; index < this.tokens.tokens.length; index++) { | ||||
|       const token = this.tokens.tokens[index]; | ||||
|       if (token.type === tt.braceR && token.contextId === objectContextId) { | ||||
|         index++; | ||||
|         break; | ||||
|       } | ||||
|  | ||||
|       if ( | ||||
|         this.tokens.identifierNameAtIndex(index) === "displayName" && | ||||
|         this.tokens.tokens[index].identifierRole === IdentifierRole.ObjectKey && | ||||
|         token.contextId === objectContextId | ||||
|       ) { | ||||
|         // We found a displayName key, so bail out. | ||||
|         return false; | ||||
|       } | ||||
|     } | ||||
|  | ||||
|     if (index === this.tokens.tokens.length) { | ||||
|       throw new Error("Unexpected end of input when processing React class."); | ||||
|     } | ||||
|  | ||||
|     // If we got this far, we know we have createClass with an object with no | ||||
|     // display name, so we want to proceed as long as that was the only argument. | ||||
|     return ( | ||||
|       this.tokens.matches1AtIndex(index, tt.parenR) || | ||||
|       this.tokens.matches2AtIndex(index, tt.comma, tt.parenR) | ||||
|     ); | ||||
|   } | ||||
| } | ||||
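The fallback naming logic in `getDisplayNameFromFilename` above is self-contained enough to run standalone. This copy (under a hypothetical wrapper name) shows the `index`-file special case, where the parent directory supplies the display name:

```javascript
// Mirrors getDisplayNameFromFilename: basename without extension, except that
// "index" files take their name from the enclosing directory.
function displayNameFromFilename(filePath) {
  const pathSegments = filePath.split("/");
  const filename = pathSegments[pathSegments.length - 1];
  const dotIndex = filename.lastIndexOf(".");
  const baseFilename = dotIndex === -1 ? filename : filename.slice(0, dotIndex);
  if (baseFilename === "index" && pathSegments[pathSegments.length - 2]) {
    return pathSegments[pathSegments.length - 2];
  }
  return baseFilename;
}

console.log(displayNameFromFilename("src/Button.jsx"));      // "Button"
console.log(displayNameFromFilename("src/Button/index.js")); // "Button"
```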
							
								
								
									
69 node_modules/sucrase/dist/esm/transformers/ReactHotLoaderTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,69 @@ | |||
| import {IdentifierRole, isTopLevelDeclaration} from "../parser/tokenizer"; | ||||
|  | ||||
| import Transformer from "./Transformer"; | ||||
|  | ||||
| export default class ReactHotLoaderTransformer extends Transformer { | ||||
|    __init() {this.extractedDefaultExportName = null} | ||||
|  | ||||
|   constructor( tokens,  filePath) { | ||||
|     super();this.tokens = tokens;this.filePath = filePath;ReactHotLoaderTransformer.prototype.__init.call(this);; | ||||
|   } | ||||
|  | ||||
|   setExtractedDefaultExportName(extractedDefaultExportName) { | ||||
|     this.extractedDefaultExportName = extractedDefaultExportName; | ||||
|   } | ||||
|  | ||||
|   getPrefixCode() { | ||||
|     return ` | ||||
|       (function () { | ||||
|         var enterModule = require('react-hot-loader').enterModule; | ||||
|         enterModule && enterModule(module); | ||||
|       })();` | ||||
|       .replace(/\s+/g, " ") | ||||
|       .trim(); | ||||
|   } | ||||
|  | ||||
|   getSuffixCode() { | ||||
|     const topLevelNames = new Set(); | ||||
|     for (const token of this.tokens.tokens) { | ||||
|       if ( | ||||
|         !token.isType && | ||||
|         isTopLevelDeclaration(token) && | ||||
|         token.identifierRole !== IdentifierRole.ImportDeclaration | ||||
|       ) { | ||||
|         topLevelNames.add(this.tokens.identifierNameForToken(token)); | ||||
|       } | ||||
|     } | ||||
|     const namesToRegister = Array.from(topLevelNames).map((name) => ({ | ||||
|       variableName: name, | ||||
|       uniqueLocalName: name, | ||||
|     })); | ||||
|     if (this.extractedDefaultExportName) { | ||||
|       namesToRegister.push({ | ||||
|         variableName: this.extractedDefaultExportName, | ||||
|         uniqueLocalName: "default", | ||||
|       }); | ||||
|     } | ||||
|     return ` | ||||
| ;(function () { | ||||
|   var reactHotLoader = require('react-hot-loader').default; | ||||
|   var leaveModule = require('react-hot-loader').leaveModule; | ||||
|   if (!reactHotLoader) { | ||||
|     return; | ||||
|   } | ||||
| ${namesToRegister | ||||
|   .map( | ||||
|     ({variableName, uniqueLocalName}) => | ||||
|       `  reactHotLoader.register(${variableName}, "${uniqueLocalName}", ${JSON.stringify( | ||||
|         this.filePath || "", | ||||
|       )});`, | ||||
|   ) | ||||
|   .join("\n")} | ||||
|   leaveModule(module); | ||||
| })();`; | ||||
|   } | ||||
|  | ||||
|   process() { | ||||
|     return false; | ||||
|   } | ||||
| } | ||||
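The registration lines in `getSuffixCode` above come from a simple map over the collected names. A standalone sketch, where `Button`, `_default`, and the file path are illustrative values rather than anything from this commit:

```javascript
// Mirrors the namesToRegister mapping in getSuffixCode: one register() line per
// top-level name, plus the extracted default export under the name "default".
const namesToRegister = [
  {variableName: "Button", uniqueLocalName: "Button"},
  {variableName: "_default", uniqueLocalName: "default"},
];
const registerLines = namesToRegister
  .map(
    ({variableName, uniqueLocalName}) =>
      `  reactHotLoader.register(${variableName}, "${uniqueLocalName}", ${JSON.stringify("src/Button.js")});`,
  )
  .join("\n");

console.log(registerLines);
```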
							
								
								
									
462 node_modules/sucrase/dist/esm/transformers/RootTransformer.js generated vendored Normal file
							|  | @ -0,0 +1,462 @@ | |||
|  | ||||
|  | ||||
|  | ||||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
|  | ||||
| import getClassInfo, {} from "../util/getClassInfo"; | ||||
| import CJSImportTransformer from "./CJSImportTransformer"; | ||||
| import ESMImportTransformer from "./ESMImportTransformer"; | ||||
| import FlowTransformer from "./FlowTransformer"; | ||||
| import JestHoistTransformer from "./JestHoistTransformer"; | ||||
| import JSXTransformer from "./JSXTransformer"; | ||||
| import NumericSeparatorTransformer from "./NumericSeparatorTransformer"; | ||||
| import OptionalCatchBindingTransformer from "./OptionalCatchBindingTransformer"; | ||||
| import OptionalChainingNullishTransformer from "./OptionalChainingNullishTransformer"; | ||||
| import ReactDisplayNameTransformer from "./ReactDisplayNameTransformer"; | ||||
| import ReactHotLoaderTransformer from "./ReactHotLoaderTransformer"; | ||||
|  | ||||
| import TypeScriptTransformer from "./TypeScriptTransformer"; | ||||
|  | ||||
|  | ||||
|  | ||||
|  | ||||
|  | ||||
|  | ||||
|  | ||||
|  | ||||
| export default class RootTransformer { | ||||
|    __init() {this.transformers = []} | ||||
|    | ||||
|    | ||||
|    __init2() {this.generatedVariables = []} | ||||
|    | ||||
|    | ||||
|    | ||||
|    | ||||
|  | ||||
|   constructor( | ||||
|     sucraseContext, | ||||
|     transforms, | ||||
|     enableLegacyBabel5ModuleInterop, | ||||
|     options, | ||||
|   ) {;RootTransformer.prototype.__init.call(this);RootTransformer.prototype.__init2.call(this); | ||||
|     this.nameManager = sucraseContext.nameManager; | ||||
|     this.helperManager = sucraseContext.helperManager; | ||||
|     const {tokenProcessor, importProcessor} = sucraseContext; | ||||
|     this.tokens = tokenProcessor; | ||||
|     this.isImportsTransformEnabled = transforms.includes("imports"); | ||||
|     this.isReactHotLoaderTransformEnabled = transforms.includes("react-hot-loader"); | ||||
|     this.disableESTransforms = Boolean(options.disableESTransforms); | ||||
|  | ||||
|     if (!options.disableESTransforms) { | ||||
|       this.transformers.push( | ||||
|         new OptionalChainingNullishTransformer(tokenProcessor, this.nameManager), | ||||
|       ); | ||||
|       this.transformers.push(new NumericSeparatorTransformer(tokenProcessor)); | ||||
|       this.transformers.push(new OptionalCatchBindingTransformer(tokenProcessor, this.nameManager)); | ||||
|     } | ||||
|  | ||||
|     if (transforms.includes("jsx")) { | ||||
|       if (options.jsxRuntime !== "preserve") { | ||||
|         this.transformers.push( | ||||
|           new JSXTransformer(this, tokenProcessor, importProcessor, this.nameManager, options), | ||||
|         ); | ||||
|       } | ||||
|       this.transformers.push( | ||||
|         new ReactDisplayNameTransformer(this, tokenProcessor, importProcessor, options), | ||||
|       ); | ||||
|     } | ||||
|  | ||||
|     let reactHotLoaderTransformer = null; | ||||
|     if (transforms.includes("react-hot-loader")) { | ||||
|       if (!options.filePath) { | ||||
|         throw new Error("filePath is required when using the react-hot-loader transform."); | ||||
|       } | ||||
|       reactHotLoaderTransformer = new ReactHotLoaderTransformer(tokenProcessor, options.filePath); | ||||
|       this.transformers.push(reactHotLoaderTransformer); | ||||
|     } | ||||
| 
 | ||||
|     // Note that we always want to enable the imports transformer, even when the import transform
 | ||||
|     // itself isn't enabled, since we need to do type-only import pruning for both Flow and
 | ||||
|     // TypeScript.
 | ||||
|     if (transforms.includes("imports")) { | ||||
|       if (importProcessor === null) { | ||||
|         throw new Error("Expected non-null importProcessor with imports transform enabled."); | ||||
|       } | ||||
|       this.transformers.push( | ||||
|         new CJSImportTransformer( | ||||
|           this, | ||||
|           tokenProcessor, | ||||
|           importProcessor, | ||||
|           this.nameManager, | ||||
|           this.helperManager, | ||||
|           reactHotLoaderTransformer, | ||||
|           enableLegacyBabel5ModuleInterop, | ||||
|           Boolean(options.enableLegacyTypeScriptModuleInterop), | ||||
|           transforms.includes("typescript"), | ||||
|           transforms.includes("flow"), | ||||
|           Boolean(options.preserveDynamicImport), | ||||
|           Boolean(options.keepUnusedImports), | ||||
|         ), | ||||
|       ); | ||||
|     } else { | ||||
|       this.transformers.push( | ||||
|         new ESMImportTransformer( | ||||
|           tokenProcessor, | ||||
|           this.nameManager, | ||||
|           this.helperManager, | ||||
|           reactHotLoaderTransformer, | ||||
|           transforms.includes("typescript"), | ||||
|           transforms.includes("flow"), | ||||
|           Boolean(options.keepUnusedImports), | ||||
|           options, | ||||
|         ), | ||||
|       ); | ||||
|     } | ||||
| 
 | ||||
|     if (transforms.includes("flow")) { | ||||
|       this.transformers.push( | ||||
|         new FlowTransformer(this, tokenProcessor, transforms.includes("imports")), | ||||
|       ); | ||||
|     } | ||||
|     if (transforms.includes("typescript")) { | ||||
|       this.transformers.push( | ||||
|         new TypeScriptTransformer(this, tokenProcessor, transforms.includes("imports")), | ||||
|       ); | ||||
|     } | ||||
|     if (transforms.includes("jest")) { | ||||
|       this.transformers.push( | ||||
|         new JestHoistTransformer(this, tokenProcessor, this.nameManager, importProcessor), | ||||
|       ); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   transform() { | ||||
|     this.tokens.reset(); | ||||
|     this.processBalancedCode(); | ||||
|     const shouldAddUseStrict = this.isImportsTransformEnabled; | ||||
|     // "use strict" always needs to be first, so override the normal transformer order.
 | ||||
|     let prefix = shouldAddUseStrict ? '"use strict";' : ""; | ||||
|     for (const transformer of this.transformers) { | ||||
|       prefix += transformer.getPrefixCode(); | ||||
|     } | ||||
|     prefix += this.helperManager.emitHelpers(); | ||||
|     prefix += this.generatedVariables.map((v) => ` var ${v};`).join(""); | ||||
|     for (const transformer of this.transformers) { | ||||
|       prefix += transformer.getHoistedCode(); | ||||
|     } | ||||
|     let suffix = ""; | ||||
|     for (const transformer of this.transformers) { | ||||
|       suffix += transformer.getSuffixCode(); | ||||
|     } | ||||
|     const result = this.tokens.finish(); | ||||
|     let {code} = result; | ||||
|     if (code.startsWith("#!")) { | ||||
|       let newlineIndex = code.indexOf("\n"); | ||||
|       if (newlineIndex === -1) { | ||||
|         newlineIndex = code.length; | ||||
|         code += "\n"; | ||||
|       } | ||||
|       return { | ||||
|         code: code.slice(0, newlineIndex + 1) + prefix + code.slice(newlineIndex + 1) + suffix, | ||||
|         // The hashbang line has no tokens, so shifting the tokens to account
 | ||||
|         // for prefix can happen normally.
 | ||||
|         mappings: this.shiftMappings(result.mappings, prefix.length), | ||||
|       }; | ||||
|     } else { | ||||
|       return { | ||||
|         code: prefix + code + suffix, | ||||
|         mappings: this.shiftMappings(result.mappings, prefix.length), | ||||
|       }; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processBalancedCode() { | ||||
|     let braceDepth = 0; | ||||
|     let parenDepth = 0; | ||||
|     while (!this.tokens.isAtEnd()) { | ||||
|       if (this.tokens.matches1(tt.braceL) || this.tokens.matches1(tt.dollarBraceL)) { | ||||
|         braceDepth++; | ||||
|       } else if (this.tokens.matches1(tt.braceR)) { | ||||
|         if (braceDepth === 0) { | ||||
|           return; | ||||
|         } | ||||
|         braceDepth--; | ||||
|       } | ||||
|       if (this.tokens.matches1(tt.parenL)) { | ||||
|         parenDepth++; | ||||
|       } else if (this.tokens.matches1(tt.parenR)) { | ||||
|         if (parenDepth === 0) { | ||||
|           return; | ||||
|         } | ||||
|         parenDepth--; | ||||
|       } | ||||
|       this.processToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processToken() { | ||||
|     if (this.tokens.matches1(tt._class)) { | ||||
|       this.processClass(); | ||||
|       return; | ||||
|     } | ||||
|     for (const transformer of this.transformers) { | ||||
|       const wasProcessed = transformer.process(); | ||||
|       if (wasProcessed) { | ||||
|         return; | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyToken(); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Skip past a class with a name and return that name. | ||||
|    */ | ||||
|   processNamedClass() { | ||||
|     if (!this.tokens.matches2(tt._class, tt.name)) { | ||||
|       throw new Error("Expected identifier for exported class name."); | ||||
|     } | ||||
|     const name = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|     this.processClass(); | ||||
|     return name; | ||||
|   } | ||||
| 
 | ||||
|   processClass() { | ||||
|     const classInfo = getClassInfo(this, this.tokens, this.nameManager, this.disableESTransforms); | ||||
| 
 | ||||
|     // Both static and instance initializers need a class name to use to invoke the initializer, so
 | ||||
|     // assign to one if necessary.
 | ||||
|     const needsCommaExpression = | ||||
|       (classInfo.headerInfo.isExpression || !classInfo.headerInfo.className) && | ||||
|       classInfo.staticInitializerNames.length + classInfo.instanceInitializerNames.length > 0; | ||||
| 
 | ||||
|     let className = classInfo.headerInfo.className; | ||||
|     if (needsCommaExpression) { | ||||
|       className = this.nameManager.claimFreeName("_class"); | ||||
|       this.generatedVariables.push(className); | ||||
|       this.tokens.appendCode(` (${className} =`); | ||||
|     } | ||||
| 
 | ||||
|     const classToken = this.tokens.currentToken(); | ||||
|     const contextId = classToken.contextId; | ||||
|     if (contextId == null) { | ||||
|       throw new Error("Expected class to have a context ID."); | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt._class); | ||||
|     while (!this.tokens.matchesContextIdAndLabel(tt.braceL, contextId)) { | ||||
|       this.processToken(); | ||||
|     } | ||||
| 
 | ||||
|     this.processClassBody(classInfo, className); | ||||
| 
 | ||||
|     const staticInitializerStatements = classInfo.staticInitializerNames.map( | ||||
|       (name) => `${className}.${name}()`, | ||||
|     ); | ||||
|     if (needsCommaExpression) { | ||||
|       this.tokens.appendCode( | ||||
|         `, ${staticInitializerStatements.map((s) => `${s}, `).join("")}${className})`, | ||||
|       ); | ||||
|     } else if (classInfo.staticInitializerNames.length > 0) { | ||||
|       this.tokens.appendCode(` ${staticInitializerStatements.map((s) => `${s};`).join(" ")}`); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * We want to just handle class fields in all contexts, since TypeScript supports them. Later, | ||||
|    * when some JS implementations support class fields, this should be made optional. | ||||
|    */ | ||||
|   processClassBody(classInfo, className) { | ||||
|     const { | ||||
|       headerInfo, | ||||
|       constructorInsertPos, | ||||
|       constructorInitializerStatements, | ||||
|       fields, | ||||
|       instanceInitializerNames, | ||||
|       rangesToRemove, | ||||
|     } = classInfo; | ||||
|     let fieldIndex = 0; | ||||
|     let rangeToRemoveIndex = 0; | ||||
|     const classContextId = this.tokens.currentToken().contextId; | ||||
|     if (classContextId == null) { | ||||
|       throw new Error("Expected non-null context ID on class."); | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt.braceL); | ||||
|     if (this.isReactHotLoaderTransformEnabled) { | ||||
|       this.tokens.appendCode( | ||||
|         "__reactstandin__regenerateByEval(key, code) {this[key] = eval(code);}", | ||||
|       ); | ||||
|     } | ||||
| 
 | ||||
|     const needsConstructorInit = | ||||
|       constructorInitializerStatements.length + instanceInitializerNames.length > 0; | ||||
| 
 | ||||
|     if (constructorInsertPos === null && needsConstructorInit) { | ||||
|       const constructorInitializersCode = this.makeConstructorInitCode( | ||||
|         constructorInitializerStatements, | ||||
|         instanceInitializerNames, | ||||
|         className, | ||||
|       ); | ||||
|       if (headerInfo.hasSuperclass) { | ||||
|         const argsName = this.nameManager.claimFreeName("args"); | ||||
|         this.tokens.appendCode( | ||||
|           `constructor(...${argsName}) { super(...${argsName}); ${constructorInitializersCode}; }`, | ||||
|         ); | ||||
|       } else { | ||||
|         this.tokens.appendCode(`constructor() { ${constructorInitializersCode}; }`); | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     while (!this.tokens.matchesContextIdAndLabel(tt.braceR, classContextId)) { | ||||
|       if (fieldIndex < fields.length && this.tokens.currentIndex() === fields[fieldIndex].start) { | ||||
|         let needsCloseBrace = false; | ||||
|         if (this.tokens.matches1(tt.bracketL)) { | ||||
|           this.tokens.copyTokenWithPrefix(`${fields[fieldIndex].initializerName}() {this`); | ||||
|         } else if (this.tokens.matches1(tt.string) || this.tokens.matches1(tt.num)) { | ||||
|           this.tokens.copyTokenWithPrefix(`${fields[fieldIndex].initializerName}() {this[`); | ||||
|           needsCloseBrace = true; | ||||
|         } else { | ||||
|           this.tokens.copyTokenWithPrefix(`${fields[fieldIndex].initializerName}() {this.`); | ||||
|         } | ||||
|         while (this.tokens.currentIndex() < fields[fieldIndex].end) { | ||||
|           if (needsCloseBrace && this.tokens.currentIndex() === fields[fieldIndex].equalsIndex) { | ||||
|             this.tokens.appendCode("]"); | ||||
|           } | ||||
|           this.processToken(); | ||||
|         } | ||||
|         this.tokens.appendCode("}"); | ||||
|         fieldIndex++; | ||||
|       } else if ( | ||||
|         rangeToRemoveIndex < rangesToRemove.length && | ||||
|         this.tokens.currentIndex() >= rangesToRemove[rangeToRemoveIndex].start | ||||
|       ) { | ||||
|         if (this.tokens.currentIndex() < rangesToRemove[rangeToRemoveIndex].end) { | ||||
|           this.tokens.removeInitialToken(); | ||||
|         } | ||||
|         while (this.tokens.currentIndex() < rangesToRemove[rangeToRemoveIndex].end) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         rangeToRemoveIndex++; | ||||
|       } else if (this.tokens.currentIndex() === constructorInsertPos) { | ||||
|         this.tokens.copyToken(); | ||||
|         if (needsConstructorInit) { | ||||
|           this.tokens.appendCode( | ||||
|             `;${this.makeConstructorInitCode( | ||||
|               constructorInitializerStatements, | ||||
|               instanceInitializerNames, | ||||
|               className, | ||||
|             )};`,
 | ||||
|           ); | ||||
|         } | ||||
|         this.processToken(); | ||||
|       } else { | ||||
|         this.processToken(); | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(tt.braceR); | ||||
|   } | ||||
| 
 | ||||
|   makeConstructorInitCode( | ||||
|     constructorInitializerStatements, | ||||
|     instanceInitializerNames, | ||||
|     className, | ||||
|   ) { | ||||
|     return [ | ||||
|       ...constructorInitializerStatements, | ||||
|       ...instanceInitializerNames.map((name) => `${className}.prototype.${name}.call(this)`), | ||||
|     ].join(";"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Normally it's ok to simply remove type tokens, but we need to be more careful when dealing with | ||||
|    * arrow function return types since they can confuse the parser. In that case, we want to move | ||||
|    * the close-paren to the same line as the arrow. | ||||
|    * | ||||
|    * See https://github.com/alangpierce/sucrase/issues/391 for more details.
 | ||||
|    */ | ||||
|   processPossibleArrowParamEnd() { | ||||
|     if (this.tokens.matches2(tt.parenR, tt.colon) && this.tokens.tokenAtRelativeIndex(1).isType) { | ||||
|       let nextNonTypeIndex = this.tokens.currentIndex() + 1; | ||||
|       // Look ahead to see if this is an arrow function or something else.
 | ||||
|       while (this.tokens.tokens[nextNonTypeIndex].isType) { | ||||
|         nextNonTypeIndex++; | ||||
|       } | ||||
|       if (this.tokens.matches1AtIndex(nextNonTypeIndex, tt.arrow)) { | ||||
|         this.tokens.removeInitialToken(); | ||||
|         while (this.tokens.currentIndex() < nextNonTypeIndex) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         this.tokens.replaceTokenTrimmingLeftWhitespace(") =>"); | ||||
|         return true; | ||||
|       } | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * An async arrow function might be of the form: | ||||
|    * | ||||
|    * async < | ||||
|    *   T | ||||
|    * >() => {} | ||||
|    * | ||||
|    * in which case, removing the type parameters will cause a syntax error. Detect this case and | ||||
|    * move the open-paren earlier. | ||||
|    */ | ||||
|   processPossibleAsyncArrowWithTypeParams() { | ||||
|     if ( | ||||
|       !this.tokens.matchesContextual(ContextualKeyword._async) && | ||||
|       !this.tokens.matches1(tt._async) | ||||
|     ) { | ||||
|       return false; | ||||
|     } | ||||
|     const nextToken = this.tokens.tokenAtRelativeIndex(1); | ||||
|     if (nextToken.type !== tt.lessThan || !nextToken.isType) { | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     let nextNonTypeIndex = this.tokens.currentIndex() + 1; | ||||
|     // Look ahead to see if this is an arrow function or something else.
 | ||||
|     while (this.tokens.tokens[nextNonTypeIndex].isType) { | ||||
|       nextNonTypeIndex++; | ||||
|     } | ||||
|     if (this.tokens.matches1AtIndex(nextNonTypeIndex, tt.parenL)) { | ||||
|       this.tokens.replaceToken("async ("); | ||||
|       this.tokens.removeInitialToken(); | ||||
|       while (this.tokens.currentIndex() < nextNonTypeIndex) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       this.tokens.removeToken(); | ||||
|       // We ate a ( token, so we need to process the tokens in between and then the ) token so that
 | ||||
|       // we remain balanced.
 | ||||
|       this.processBalancedCode(); | ||||
|       this.processToken(); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   processPossibleTypeRange() { | ||||
|     if (this.tokens.currentToken().isType) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       while (this.tokens.currentToken().isType) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   shiftMappings( | ||||
|     mappings, | ||||
|     prefixLength, | ||||
|   ) { | ||||
|     for (let i = 0; i < mappings.length; i++) { | ||||
|       const mapping = mappings[i]; | ||||
|       if (mapping !== undefined) { | ||||
|         mappings[i] = mapping + prefixLength; | ||||
|       } | ||||
|     } | ||||
|     return mappings; | ||||
|   } | ||||
| } | ||||
16 node_modules/sucrase/dist/esm/transformers/Transformer.js generated vendored Normal file
|  | @ -0,0 +1,16 @@ | |||
| export default class Transformer { | ||||
|   // Return true if anything was processed, false otherwise.
 | ||||
|    | ||||
| 
 | ||||
|   getPrefixCode() { | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   getHoistedCode() { | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   getSuffixCode() { | ||||
|     return ""; | ||||
|   } | ||||
| } | ||||
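A concrete transformer subclasses the interface above, overriding `process()` to return true only when it consumed the current token; the three code hooks default to empty strings. A minimal no-op sketch (the class name is hypothetical):

```javascript
// Smallest possible transformer honoring the Transformer contract:
// it never claims a token, so RootTransformer falls through to copyToken().
class NoopTransformer {
  process() {
    return false; // nothing processed
  }
  getPrefixCode() {
    return "";
  }
  getHoistedCode() {
    return "";
  }
  getSuffixCode() {
    return "";
  }
}
```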
279 node_modules/sucrase/dist/esm/transformers/TypeScriptTransformer.js generated vendored Normal file
|  | @ -0,0 +1,279 @@ | |||
| 
 | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| import isIdentifier from "../util/isIdentifier"; | ||||
| 
 | ||||
| import Transformer from "./Transformer"; | ||||
| 
 | ||||
| export default class TypeScriptTransformer extends Transformer { | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      isImportsTransformEnabled, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.isImportsTransformEnabled = isImportsTransformEnabled; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if ( | ||||
|       this.rootTransformer.processPossibleArrowParamEnd() || | ||||
|       this.rootTransformer.processPossibleAsyncArrowWithTypeParams() || | ||||
|       this.rootTransformer.processPossibleTypeRange() | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches1(tt._public) || | ||||
|       this.tokens.matches1(tt._protected) || | ||||
|       this.tokens.matches1(tt._private) || | ||||
|       this.tokens.matches1(tt._abstract) || | ||||
|       this.tokens.matches1(tt._readonly) || | ||||
|       this.tokens.matches1(tt._override) || | ||||
|       this.tokens.matches1(tt.nonNullAssertion) | ||||
|     ) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(tt._enum) || this.tokens.matches2(tt._const, tt._enum)) { | ||||
|       this.processEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches2(tt._export, tt._enum) || | ||||
|       this.tokens.matches3(tt._export, tt._const, tt._enum) | ||||
|     ) { | ||||
|       this.processEnum(true); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   processEnum(isExport = false) { | ||||
|     // We might have "export const enum", so just remove all relevant tokens.
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (this.tokens.matches1(tt._const) || this.tokens.matches1(tt._enum)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const enumName = this.tokens.identifierName(); | ||||
|     this.tokens.removeToken(); | ||||
|     if (isExport && !this.isImportsTransformEnabled) { | ||||
|       this.tokens.appendCode("export "); | ||||
|     } | ||||
|     this.tokens.appendCode(`var ${enumName}; (function (${enumName})`); | ||||
|     this.tokens.copyExpectedToken(tt.braceL); | ||||
|     this.processEnumBody(enumName); | ||||
|     this.tokens.copyExpectedToken(tt.braceR); | ||||
|     if (isExport && this.isImportsTransformEnabled) { | ||||
|       this.tokens.appendCode(`)(${enumName} || (exports.${enumName} = ${enumName} = {}));`); | ||||
|     } else { | ||||
|       this.tokens.appendCode(`)(${enumName} || (${enumName} = {}));`); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform an enum into equivalent JS. This has complexity in a few places: | ||||
|    * - TS allows string enums, numeric enums, and a mix of the two styles within an enum. | ||||
|    * - Enum keys are allowed to be referenced in later enum values. | ||||
|    * - Enum keys are allowed to be strings. | ||||
|    * - When enum values are omitted, they should follow an auto-increment behavior. | ||||
|    */ | ||||
|   processEnumBody(enumName) { | ||||
|     // Code that can be used to reference the previous enum member, or null if this is the first
 | ||||
|     // enum member.
 | ||||
|     let previousValueCode = null; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches1(tt.braceR)) { | ||||
|         break; | ||||
|       } | ||||
|       const {nameStringCode, variableName} = this.extractEnumKeyInfo(this.tokens.currentToken()); | ||||
|       this.tokens.removeInitialToken(); | ||||
| 
 | ||||
|       if ( | ||||
|         this.tokens.matches3(tt.eq, tt.string, tt.comma) || | ||||
|         this.tokens.matches3(tt.eq, tt.string, tt.braceR) | ||||
|       ) { | ||||
|         this.processStringLiteralEnumMember(enumName, nameStringCode, variableName); | ||||
|       } else if (this.tokens.matches1(tt.eq)) { | ||||
|         this.processExplicitValueEnumMember(enumName, nameStringCode, variableName); | ||||
|       } else { | ||||
|         this.processImplicitValueEnumMember( | ||||
|           enumName, | ||||
|           nameStringCode, | ||||
|           variableName, | ||||
|           previousValueCode, | ||||
|         ); | ||||
|       } | ||||
|       if (this.tokens.matches1(tt.comma)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
| 
 | ||||
|       if (variableName != null) { | ||||
|         previousValueCode = variableName; | ||||
|       } else { | ||||
|         previousValueCode = `${enumName}[${nameStringCode}]`; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Detect name information about this enum key, which will be used to determine which code to emit | ||||
|    * and whether we should declare a variable as part of this declaration. | ||||
|    * | ||||
|    * Some cases to keep in mind: | ||||
|    * - Enum keys can be implicitly referenced later, e.g. `X = 1, Y = X`. In Sucrase, we implement | ||||
|    *   this by declaring a variable `X` so that later expressions can use it. | ||||
|    * - In addition to the usual identifier key syntax, enum keys are allowed to be string literals, | ||||
|    *   e.g. `"hello world" = 3,`. Template literal syntax is NOT allowed. | ||||
|    * - Even if the enum key is defined as a string literal, it may still be referenced by identifier | ||||
|    *   later, e.g. `"X" = 1, Y = X`. That means that we need to detect whether or not a string | ||||
|    *   literal is identifier-like and emit a variable if so, even if the declaration did not use an | ||||
|    *   identifier. | ||||
|    * - Reserved keywords like `break` are valid enum keys, but are not valid to be referenced later | ||||
|    *   and would be a syntax error if we emitted a variable, so we need to skip the variable | ||||
|    *   declaration in those cases. | ||||
|    * | ||||
|    * The variableName return value captures these nuances: if non-null, we can and must emit a | ||||
|    * variable declaration, and if null, we can't and shouldn't. | ||||
|    */ | ||||
|   extractEnumKeyInfo(nameToken) { | ||||
|     if (nameToken.type === tt.name) { | ||||
|       const name = this.tokens.identifierNameForToken(nameToken); | ||||
|       return { | ||||
|         nameStringCode: `"${name}"`, | ||||
|         variableName: isIdentifier(name) ? name : null, | ||||
|       }; | ||||
|     } else if (nameToken.type === tt.string) { | ||||
|       const name = this.tokens.stringValueForToken(nameToken); | ||||
|       return { | ||||
|         nameStringCode: this.tokens.code.slice(nameToken.start, nameToken.end), | ||||
|         variableName: isIdentifier(name) ? name : null, | ||||
|       }; | ||||
|     } else { | ||||
|       throw new Error("Expected name or string at beginning of enum element."); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle an enum member where the RHS is just a string literal (not omitted, not a number, and | ||||
|    * not a complex expression). This is the typical form for TS string enums, and in this case, we | ||||
|    * do *not* create a reverse mapping. | ||||
|    * | ||||
|    * This is called after deleting the key token, when the token processor is at the equals sign. | ||||
|    * | ||||
|    * Example 1: | ||||
|    * someKey = "some value" | ||||
|    * -> | ||||
|    * const someKey = "some value"; MyEnum["someKey"] = someKey; | ||||
|    * | ||||
|    * Example 2: | ||||
|    * "some key" = "some value" | ||||
|    * -> | ||||
|    * MyEnum["some key"] = "some value"; | ||||
|    */ | ||||
|   processStringLiteralEnumMember( | ||||
|     enumName, | ||||
|     nameStringCode, | ||||
|     variableName, | ||||
|   ) { | ||||
|     if (variableName != null) { | ||||
|       this.tokens.appendCode(`const ${variableName}`); | ||||
|       // =
 | ||||
|       this.tokens.copyToken(); | ||||
|       // value string
 | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(`; ${enumName}[${nameStringCode}] = ${variableName};`); | ||||
|     } else { | ||||
|       this.tokens.appendCode(`${enumName}[${nameStringCode}]`); | ||||
|       // =
 | ||||
|       this.tokens.copyToken(); | ||||
|       // value string
 | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(";"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle an enum member initialized with an expression on the right-hand side (other than a | ||||
|    * string literal). In these cases, we should transform the expression and emit code that sets up | ||||
|    * a reverse mapping. | ||||
|    * | ||||
|    * The TypeScript implementation of this operation distinguishes between expressions that can be | ||||
|    * "constant folded" at compile time (i.e. consist of number literals and simple math operations | ||||
|    * on those numbers) and ones that are dynamic. For constant expressions, it emits the resolved | ||||
|    * numeric value, and auto-incrementing is only allowed in that case. Evaluating expressions at | ||||
|    * compile time would add significant complexity to Sucrase, so Sucrase instead leaves the | ||||
|    * expression as-is, and will later emit something like `MyEnum["previousKey"] + 1` to implement | ||||
|    * auto-incrementing. | ||||
|    * | ||||
|    * This is called after deleting the key token, when the token processor is at the equals sign. | ||||
|    * | ||||
|    * Example 1: | ||||
|    * someKey = 1 + 1 | ||||
|    * -> | ||||
|    * const someKey = 1 + 1; MyEnum[MyEnum["someKey"] = someKey] = "someKey"; | ||||
|    * | ||||
|    * Example 2: | ||||
|    * "some key" = 1 + 1 | ||||
|    * -> | ||||
|    * MyEnum[MyEnum["some key"] = 1 + 1] = "some key"; | ||||
|    */ | ||||
|   processExplicitValueEnumMember( | ||||
|     enumName, | ||||
|     nameStringCode, | ||||
|     variableName, | ||||
|   ) { | ||||
|     const rhsEndIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|     if (rhsEndIndex == null) { | ||||
|       throw new Error("Expected rhsEndIndex on enum assign."); | ||||
|     } | ||||
| 
 | ||||
|     if (variableName != null) { | ||||
|       this.tokens.appendCode(`const ${variableName}`); | ||||
|       this.tokens.copyToken(); | ||||
|       while (this.tokens.currentIndex() < rhsEndIndex) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.appendCode( | ||||
|         `; ${enumName}[${enumName}[${nameStringCode}] = ${variableName}] = ${nameStringCode};`, | ||||
|       ); | ||||
|     } else { | ||||
|       this.tokens.appendCode(`${enumName}[${enumName}[${nameStringCode}]`); | ||||
|       this.tokens.copyToken(); | ||||
|       while (this.tokens.currentIndex() < rhsEndIndex) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.appendCode(`] = ${nameStringCode};`); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle an enum member with no right-hand side expression. In this case, the value is the | ||||
|    * previous value plus 1, or 0 if there was no previous value. We should also always emit a | ||||
|    * reverse mapping. | ||||
|    * | ||||
|    * Example 1: | ||||
|    * someKey2 | ||||
|    * -> | ||||
|    * const someKey2 = someKey1 + 1; MyEnum[MyEnum["someKey2"] = someKey2] = "someKey2"; | ||||
|    * | ||||
|    * Example 2: | ||||
|    * "some key 2" | ||||
|    * -> | ||||
|    * MyEnum[MyEnum["some key 2"] = someKey1 + 1] = "some key 2"; | ||||
|    */ | ||||
|   processImplicitValueEnumMember( | ||||
|     enumName, | ||||
|     nameStringCode, | ||||
|     variableName, | ||||
|     previousValueCode, | ||||
|   ) { | ||||
|     let valueCode = previousValueCode != null ? `${previousValueCode} + 1` : "0"; | ||||
|     if (variableName != null) { | ||||
|       this.tokens.appendCode(`const ${variableName} = ${valueCode}; `); | ||||
|       valueCode = variableName; | ||||
|     } | ||||
|     this.tokens.appendCode( | ||||
|       `${enumName}[${enumName}[${nameStringCode}] = ${valueCode}] = ${nameStringCode};`, | ||||
|     ); | ||||
|   } | ||||
| } | ||||
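Based on `processEnum` and the three member handlers above, a TypeScript enum such as `enum Color { Red, Green = 5, Blue }` is expected to compile (without the imports transform) to roughly the following JS. Each identifier-like member also becomes a local `const` so later members can reference earlier ones, omitted values auto-increment from the previous member, and numeric members get a reverse mapping:

```javascript
// Assumed output shape for `enum Color { Red, Green = 5, Blue }`,
// mirroring processImplicitValueEnumMember / processExplicitValueEnumMember.
var Color; (function (Color) {
  const Red = 0; Color[Color["Red"] = Red] = "Red";           // implicit: first member starts at 0
  const Green = 5; Color[Color["Green"] = Green] = "Green";   // explicit numeric value
  const Blue = Green + 1; Color[Color["Blue"] = Blue] = "Blue"; // implicit: previous + 1
})(Color || (Color = {}));
```

With the imports transform enabled, an exported enum additionally assigns `exports.Color` in the IIFE argument, as shown in `processEnum`.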
29 node_modules/sucrase/dist/esm/util/elideImportEquals.js generated vendored Normal file
|  | @ -0,0 +1,29 @@ | |||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| export default function elideImportEquals(tokens) { | ||||
|   // import
 | ||||
|   tokens.removeInitialToken(); | ||||
|   // name
 | ||||
|   tokens.removeToken(); | ||||
|   // =
 | ||||
|   tokens.removeToken(); | ||||
|   // name or require
 | ||||
|   tokens.removeToken(); | ||||
|   // Handle either `import A = require('A')` or `import A = B.C.D`.
 | ||||
|   if (tokens.matches1(tt.parenL)) { | ||||
|     // (
 | ||||
|     tokens.removeToken(); | ||||
|     // path string
 | ||||
|     tokens.removeToken(); | ||||
|     // )
 | ||||
|     tokens.removeToken(); | ||||
|   } else { | ||||
|     while (tokens.matches1(tt.dot)) { | ||||
|       // .
 | ||||
|       tokens.removeToken(); | ||||
|       // name
 | ||||
|       tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| } | ||||
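The removal sequence above can be illustrated with a toy token model. This is a hypothetical sketch on plain string arrays, not sucrase's real `TokenProcessor` API, but it mirrors the same branch structure for the two `import =` forms:

```javascript
// Toy version of elideImportEquals: drop the tokens of an
// `import A = require('A')` or `import A = B.C.D` statement.
function elideImportEqualsToy(tokens) {
  // Drop `import`, the binding name, `=`, and the first name/`require`.
  tokens.splice(0, 4);
  if (tokens[0] === "(") {
    // `import A = require('A')`: drop `(`, the path string, and `)`.
    tokens.splice(0, 3);
  } else {
    // `import A = B.C.D`: drop each `.` / name pair.
    while (tokens[0] === ".") {
      tokens.splice(0, 2);
    }
  }
  return tokens;
}
```

For example, `["import","A","=","require","(","'A'",")",";"]` reduces to `[";"]`, as does the dotted form `["import","A","=","B",".","C",".","D",";"]`.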
							
								
								
									
74 node_modules/sucrase/dist/esm/util/formatTokens.js generated vendored Normal file
|  | @@ -0,0 +1,74 @@ | |||
| import LinesAndColumns from "lines-and-columns"; | ||||
| 
 | ||||
| 
 | ||||
| import {formatTokenType} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| export default function formatTokens(code, tokens) { | ||||
|   if (tokens.length === 0) { | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   const tokenKeys = Object.keys(tokens[0]).filter( | ||||
|     (k) => k !== "type" && k !== "value" && k !== "start" && k !== "end" && k !== "loc", | ||||
|   ); | ||||
|   const typeKeys = Object.keys(tokens[0].type).filter((k) => k !== "label" && k !== "keyword"); | ||||
| 
 | ||||
|   const headings = ["Location", "Label", "Raw", ...tokenKeys, ...typeKeys]; | ||||
| 
 | ||||
|   const lines = new LinesAndColumns(code); | ||||
|   const rows = [headings, ...tokens.map(getTokenComponents)]; | ||||
|   const padding = headings.map(() => 0); | ||||
|   for (const components of rows) { | ||||
|     for (let i = 0; i < components.length; i++) { | ||||
|       padding[i] = Math.max(padding[i], components[i].length); | ||||
|     } | ||||
|   } | ||||
|   return rows | ||||
|     .map((components) => components.map((component, i) => component.padEnd(padding[i])).join(" ")) | ||||
|     .join("\n"); | ||||
| 
 | ||||
|   function getTokenComponents(token) { | ||||
|     const raw = code.slice(token.start, token.end); | ||||
|     return [ | ||||
|       formatRange(token.start, token.end), | ||||
|       formatTokenType(token.type), | ||||
|       truncate(String(raw), 14), | ||||
|       // @ts-ignore: Intentional dynamic access by key.
 | ||||
|       ...tokenKeys.map((key) => formatValue(token[key], key)), | ||||
|       // @ts-ignore: Intentional dynamic access by key.
 | ||||
|       ...typeKeys.map((key) => formatValue(token.type[key], key)), | ||||
|     ]; | ||||
|   } | ||||
| 
 | ||||
|   // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|   function formatValue(value, key) { | ||||
|     if (value === true) { | ||||
|       return key; | ||||
|     } else if (value === false || value === null) { | ||||
|       return ""; | ||||
|     } else { | ||||
|       return String(value); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   function formatRange(start, end) { | ||||
|     return `${formatPos(start)}-${formatPos(end)}`; | ||||
|   } | ||||
| 
 | ||||
|   function formatPos(pos) { | ||||
|     const location = lines.locationForIndex(pos); | ||||
|     if (!location) { | ||||
|       return "Unknown"; | ||||
|     } else { | ||||
|       return `${location.line + 1}:${location.column + 1}`; | ||||
|     } | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function truncate(s, length) { | ||||
|   if (s.length > length) { | ||||
|     return `${s.slice(0, length - 3)}...`; | ||||
|   } else { | ||||
|     return s; | ||||
|   } | ||||
| } | ||||
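The column alignment in `formatTokens` follows a common two-pass pattern: compute the maximum width per column, then `padEnd` every cell to that width. A minimal standalone sketch of just that technique (hypothetical helper name, not part of sucrase's API):

```javascript
// Two-pass column alignment: measure max width per column, then pad.
function formatRows(rows) {
  const padding = rows[0].map(() => 0);
  for (const row of rows) {
    for (let i = 0; i < row.length; i++) {
      padding[i] = Math.max(padding[i], row[i].length);
    }
  }
  return rows
    .map((row) => row.map((cell, i) => cell.padEnd(padding[i])).join(" "))
    .join("\n");
}
```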
							
								
								
									
352 node_modules/sucrase/dist/esm/util/getClassInfo.js generated vendored Normal file
|  | @@ -0,0 +1,352 @@ | |||
| 
 | ||||
| 
 | ||||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Get information about the class fields for this class, given a token processor pointing to the | ||||
|  * open-brace at the start of the class. | ||||
|  */ | ||||
| export default function getClassInfo( | ||||
|   rootTransformer, | ||||
|   tokens, | ||||
|   nameManager, | ||||
|   disableESTransforms, | ||||
| ) { | ||||
|   const snapshot = tokens.snapshot(); | ||||
| 
 | ||||
|   const headerInfo = processClassHeader(tokens); | ||||
| 
 | ||||
|   let constructorInitializerStatements = []; | ||||
|   const instanceInitializerNames = []; | ||||
|   const staticInitializerNames = []; | ||||
|   let constructorInsertPos = null; | ||||
|   const fields = []; | ||||
|   const rangesToRemove = []; | ||||
| 
 | ||||
|   const classContextId = tokens.currentToken().contextId; | ||||
|   if (classContextId == null) { | ||||
|     throw new Error("Expected non-null class context ID on class open-brace."); | ||||
|   } | ||||
| 
 | ||||
|   tokens.nextToken(); | ||||
|   while (!tokens.matchesContextIdAndLabel(tt.braceR, classContextId)) { | ||||
|     if (tokens.matchesContextual(ContextualKeyword._constructor) && !tokens.currentToken().isType) { | ||||
|       ({constructorInitializerStatements, constructorInsertPos} = processConstructor(tokens)); | ||||
|     } else if (tokens.matches1(tt.semi)) { | ||||
|       if (!disableESTransforms) { | ||||
|         rangesToRemove.push({start: tokens.currentIndex(), end: tokens.currentIndex() + 1}); | ||||
|       } | ||||
|       tokens.nextToken(); | ||||
|     } else if (tokens.currentToken().isType) { | ||||
|       tokens.nextToken(); | ||||
|     } else { | ||||
|       // Either a method or a field. Skip to the identifier part.
 | ||||
|       const statementStartIndex = tokens.currentIndex(); | ||||
|       let isStatic = false; | ||||
|       let isESPrivate = false; | ||||
|       let isDeclareOrAbstract = false; | ||||
|       while (isAccessModifier(tokens.currentToken())) { | ||||
|         if (tokens.matches1(tt._static)) { | ||||
|           isStatic = true; | ||||
|         } | ||||
|         if (tokens.matches1(tt.hash)) { | ||||
|           isESPrivate = true; | ||||
|         } | ||||
|         if (tokens.matches1(tt._declare) || tokens.matches1(tt._abstract)) { | ||||
|           isDeclareOrAbstract = true; | ||||
|         } | ||||
|         tokens.nextToken(); | ||||
|       } | ||||
|       if (isStatic && tokens.matches1(tt.braceL)) { | ||||
|         // This is a static block, so don't process it in any special way.
 | ||||
|         skipToNextClassElement(tokens, classContextId); | ||||
|         continue; | ||||
|       } | ||||
|       if (isESPrivate) { | ||||
|         // Sucrase doesn't attempt to transpile private fields; just leave them as-is.
 | ||||
|         skipToNextClassElement(tokens, classContextId); | ||||
|         continue; | ||||
|       } | ||||
|       if ( | ||||
|         tokens.matchesContextual(ContextualKeyword._constructor) && | ||||
|         !tokens.currentToken().isType | ||||
|       ) { | ||||
|         ({constructorInitializerStatements, constructorInsertPos} = processConstructor(tokens)); | ||||
|         continue; | ||||
|       } | ||||
| 
 | ||||
|       const nameStartIndex = tokens.currentIndex(); | ||||
|       skipFieldName(tokens); | ||||
|       if (tokens.matches1(tt.lessThan) || tokens.matches1(tt.parenL)) { | ||||
|         // This is a method, so nothing to process.
 | ||||
|         skipToNextClassElement(tokens, classContextId); | ||||
|         continue; | ||||
|       } | ||||
|       // There might be a type annotation that we need to skip.
 | ||||
|       while (tokens.currentToken().isType) { | ||||
|         tokens.nextToken(); | ||||
|       } | ||||
|       if (tokens.matches1(tt.eq)) { | ||||
|         const equalsIndex = tokens.currentIndex(); | ||||
|         // This is an initializer, so we need to wrap in an initializer method.
 | ||||
|         const valueEnd = tokens.currentToken().rhsEndIndex; | ||||
|         if (valueEnd == null) { | ||||
|           throw new Error("Expected rhsEndIndex on class field assignment."); | ||||
|         } | ||||
|         tokens.nextToken(); | ||||
|         while (tokens.currentIndex() < valueEnd) { | ||||
|           rootTransformer.processToken(); | ||||
|         } | ||||
|         let initializerName; | ||||
|         if (isStatic) { | ||||
|           initializerName = nameManager.claimFreeName("__initStatic"); | ||||
|           staticInitializerNames.push(initializerName); | ||||
|         } else { | ||||
|           initializerName = nameManager.claimFreeName("__init"); | ||||
|           instanceInitializerNames.push(initializerName); | ||||
|         } | ||||
|         // Fields start at the name, so `static x = 1;` has a field range of `x = 1;`.
 | ||||
|         fields.push({ | ||||
|           initializerName, | ||||
|           equalsIndex, | ||||
|           start: nameStartIndex, | ||||
|           end: tokens.currentIndex(), | ||||
|         }); | ||||
|       } else if (!disableESTransforms || isDeclareOrAbstract) { | ||||
|         // This is a regular field declaration, like `x;`. With the class transform enabled, we just
 | ||||
|         // remove the line so that no output is produced. With the class transform disabled, we
 | ||||
|         // usually want to preserve the declaration (but still strip types), but if the `declare`
 | ||||
|         // or `abstract` keyword is specified, we should remove the line to avoid initializing the
 | ||||
|         // value to undefined.
 | ||||
|         rangesToRemove.push({start: statementStartIndex, end: tokens.currentIndex()}); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   tokens.restoreToSnapshot(snapshot); | ||||
|   if (disableESTransforms) { | ||||
|     // With ES transforms disabled, we don't want to transform regular class
 | ||||
|     // field declarations, and we don't need to do any additional tricks to
 | ||||
|     // reference the constructor for static init, but we still need to transform
 | ||||
|     // TypeScript field initializers defined as constructor parameters and we
 | ||||
|     // still need to remove `declare` fields. For now, we run the same code
 | ||||
|     // path but omit any field information, as if the class had no field
 | ||||
|     // declarations. In the future, when we fully drop the class fields
 | ||||
|     // transform, we can simplify this code significantly.
 | ||||
|     return { | ||||
|       headerInfo, | ||||
|       constructorInitializerStatements, | ||||
|       instanceInitializerNames: [], | ||||
|       staticInitializerNames: [], | ||||
|       constructorInsertPos, | ||||
|       fields: [], | ||||
|       rangesToRemove, | ||||
|     }; | ||||
|   } else { | ||||
|     return { | ||||
|       headerInfo, | ||||
|       constructorInitializerStatements, | ||||
|       instanceInitializerNames, | ||||
|       staticInitializerNames, | ||||
|       constructorInsertPos, | ||||
|       fields, | ||||
|       rangesToRemove, | ||||
|     }; | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Move the token processor to the next method/field in the class. | ||||
|  * | ||||
|  * To do that, we seek forward to the next start of a class name (either an open | ||||
|  * bracket or an identifier, or the closing curly brace), then seek backward to | ||||
|  * include any access modifiers. | ||||
|  */ | ||||
| function skipToNextClassElement(tokens, classContextId) { | ||||
|   tokens.nextToken(); | ||||
|   while (tokens.currentToken().contextId !== classContextId) { | ||||
|     tokens.nextToken(); | ||||
|   } | ||||
|   while (isAccessModifier(tokens.tokenAtRelativeIndex(-1))) { | ||||
|     tokens.previousToken(); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function processClassHeader(tokens) { | ||||
|   const classToken = tokens.currentToken(); | ||||
|   const contextId = classToken.contextId; | ||||
|   if (contextId == null) { | ||||
|     throw new Error("Expected context ID on class token."); | ||||
|   } | ||||
|   const isExpression = classToken.isExpression; | ||||
|   if (isExpression == null) { | ||||
|     throw new Error("Expected isExpression on class token."); | ||||
|   } | ||||
|   let className = null; | ||||
|   let hasSuperclass = false; | ||||
|   tokens.nextToken(); | ||||
|   if (tokens.matches1(tt.name)) { | ||||
|     className = tokens.identifierName(); | ||||
|   } | ||||
|   while (!tokens.matchesContextIdAndLabel(tt.braceL, contextId)) { | ||||
|     // If this has a superclass, there will always be an `extends` token. If it doesn't have a
 | ||||
|     // superclass, only type parameters and `implements` clauses can show up here, all of which
 | ||||
|     // consist only of type tokens. A declaration like `class A<B extends C> {` should *not* count
 | ||||
|     // as having a superclass.
 | ||||
|     if (tokens.matches1(tt._extends) && !tokens.currentToken().isType) { | ||||
|       hasSuperclass = true; | ||||
|     } | ||||
|     tokens.nextToken(); | ||||
|   } | ||||
|   return {isExpression, className, hasSuperclass}; | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Extract useful information out of a constructor, starting at the "constructor" name. | ||||
|  */ | ||||
| function processConstructor(tokens) { | ||||
|   const constructorInitializerStatements = []; | ||||
| 
 | ||||
|   tokens.nextToken(); | ||||
|   const constructorContextId = tokens.currentToken().contextId; | ||||
|   if (constructorContextId == null) { | ||||
|     throw new Error("Expected context ID on open-paren starting constructor params."); | ||||
|   } | ||||
|   // Advance through parameters looking for access modifiers.
 | ||||
|   while (!tokens.matchesContextIdAndLabel(tt.parenR, constructorContextId)) { | ||||
|     if (tokens.currentToken().contextId === constructorContextId) { | ||||
|       // Current token is an open paren or comma just before a param, so check
 | ||||
|       // that param for access modifiers.
 | ||||
|       tokens.nextToken(); | ||||
|       if (isAccessModifier(tokens.currentToken())) { | ||||
|         tokens.nextToken(); | ||||
|         while (isAccessModifier(tokens.currentToken())) { | ||||
|           tokens.nextToken(); | ||||
|         } | ||||
|         const token = tokens.currentToken(); | ||||
|         if (token.type !== tt.name) { | ||||
|           throw new Error("Expected identifier after access modifiers in constructor arg."); | ||||
|         } | ||||
|         const name = tokens.identifierNameForToken(token); | ||||
|         constructorInitializerStatements.push(`this.${name} = ${name}`); | ||||
|       } | ||||
|     } else { | ||||
|       tokens.nextToken(); | ||||
|     } | ||||
|   } | ||||
|   // )
 | ||||
|   tokens.nextToken(); | ||||
|   // Constructor type annotations are invalid, but skip them anyway since
 | ||||
|   // they're easy to skip.
 | ||||
|   while (tokens.currentToken().isType) { | ||||
|     tokens.nextToken(); | ||||
|   } | ||||
|   let constructorInsertPos = tokens.currentIndex(); | ||||
| 
 | ||||
|   // Advance through body looking for a super call.
 | ||||
|   let foundSuperCall = false; | ||||
|   while (!tokens.matchesContextIdAndLabel(tt.braceR, constructorContextId)) { | ||||
|     if (!foundSuperCall && tokens.matches2(tt._super, tt.parenL)) { | ||||
|       tokens.nextToken(); | ||||
|       const superCallContextId = tokens.currentToken().contextId; | ||||
|       if (superCallContextId == null) { | ||||
|         throw new Error("Expected a context ID on the super call"); | ||||
|       } | ||||
|       while (!tokens.matchesContextIdAndLabel(tt.parenR, superCallContextId)) { | ||||
|         tokens.nextToken(); | ||||
|       } | ||||
|       constructorInsertPos = tokens.currentIndex(); | ||||
|       foundSuperCall = true; | ||||
|     } | ||||
|     tokens.nextToken(); | ||||
|   } | ||||
|   // }
 | ||||
|   tokens.nextToken(); | ||||
| 
 | ||||
|   return {constructorInitializerStatements, constructorInsertPos}; | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * Determine if this is any token that can go before the name in a method/field. | ||||
|  */ | ||||
| function isAccessModifier(token) { | ||||
|   return [ | ||||
|     tt._async, | ||||
|     tt._get, | ||||
|     tt._set, | ||||
|     tt.plus, | ||||
|     tt.minus, | ||||
|     tt._readonly, | ||||
|     tt._static, | ||||
|     tt._public, | ||||
|     tt._private, | ||||
|     tt._protected, | ||||
|     tt._override, | ||||
|     tt._abstract, | ||||
|     tt.star, | ||||
|     tt._declare, | ||||
|     tt.hash, | ||||
|   ].includes(token.type); | ||||
| } | ||||
| 
 | ||||
| /** | ||||
|  * The next token or set of tokens is either an identifier or an expression in square brackets, for | ||||
|  * a method or field name. | ||||
|  */ | ||||
| function skipFieldName(tokens) { | ||||
|   if (tokens.matches1(tt.bracketL)) { | ||||
|     const startToken = tokens.currentToken(); | ||||
|     const classContextId = startToken.contextId; | ||||
|     if (classContextId == null) { | ||||
|       throw new Error("Expected class context ID on computed name open bracket."); | ||||
|     } | ||||
|     while (!tokens.matchesContextIdAndLabel(tt.bracketR, classContextId)) { | ||||
|       tokens.nextToken(); | ||||
|     } | ||||
|     tokens.nextToken(); | ||||
|   } else { | ||||
|     tokens.nextToken(); | ||||
|   } | ||||
| } | ||||
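The constructor scan above turns TypeScript parameter properties (e.g. `constructor(public x)`) into `this.x = x` initializer statements. A toy sketch of that collection step, operating on params as plain strings rather than real tokens (a hypothetical simplification):

```javascript
// Toy parameter-property scan: for each param prefixed by access
// modifiers, emit a `this.<name> = <name>` initializer statement.
function collectParamInitializers(params) {
  const modifiers = ["public", "private", "protected", "readonly", "override"];
  const statements = [];
  for (const param of params) {
    const words = param.split(" ");
    // Only params where every leading word is a modifier qualify.
    if (words.length > 1 && words.slice(0, -1).every((w) => modifiers.includes(w))) {
      const name = words[words.length - 1];
      statements.push(`this.${name} = ${name}`);
    }
  }
  return statements;
}
```

So `constructor(public x, y, private readonly z)` would yield the statements `this.x = x` and `this.z = z`, with the unmodified `y` left alone.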
							
								
								
									
40 node_modules/sucrase/dist/esm/util/getDeclarationInfo.js generated vendored Normal file
|  | @@ -0,0 +1,40 @@ | |||
| import {isTopLevelDeclaration} from "../parser/tokenizer"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| export const EMPTY_DECLARATION_INFO = { | ||||
|   typeDeclarations: new Set(), | ||||
|   valueDeclarations: new Set(), | ||||
| }; | ||||
| 
 | ||||
| /** | ||||
|  * Get all top-level identifiers that should be preserved when exported in TypeScript. | ||||
|  * | ||||
|  * Examples: | ||||
|  * - If an identifier is declared as `const x`, then `export {x}` should be preserved. | ||||
|  * - If it's declared as `type x`, then `export {x}` should be removed. | ||||
|  * - If it's declared as both `const x` and `type x`, then the export should be preserved. | ||||
|  * - Classes and enums should be preserved (even though they also introduce types). | ||||
|  * - Imported identifiers should be preserved since we don't have enough information to | ||||
|  *   rule them out. --isolatedModules disallows re-exports, which catches errors here. | ||||
|  */ | ||||
| export default function getDeclarationInfo(tokens) { | ||||
|   const typeDeclarations = new Set(); | ||||
|   const valueDeclarations = new Set(); | ||||
|   for (let i = 0; i < tokens.tokens.length; i++) { | ||||
|     const token = tokens.tokens[i]; | ||||
|     if (token.type === tt.name && isTopLevelDeclaration(token)) { | ||||
|       if (token.isType) { | ||||
|         typeDeclarations.add(tokens.identifierNameForToken(token)); | ||||
|       } else { | ||||
|         valueDeclarations.add(tokens.identifierNameForToken(token)); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   return {typeDeclarations, valueDeclarations}; | ||||
| } | ||||
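The doc comment's export-elision rule can be restated as a single predicate over the two sets this function returns: an exported name is dropped only when it appears exclusively as a type declaration. A hedged sketch (hypothetical helper, not part of sucrase):

```javascript
// An export is preserved unless the name is declared *only* as a type:
// value declarations win, and unknown (e.g. imported) names are kept.
function shouldPreserveExport(name, typeDeclarations, valueDeclarations) {
  return valueDeclarations.has(name) || !typeDeclarations.has(name);
}
```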
							
								
								
									
15 node_modules/sucrase/dist/esm/util/getIdentifierNames.js generated vendored Normal file
|  | @@ -0,0 +1,15 @@ | |||
| 
 | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| /** | ||||
|  * Get all identifier names in the code, in order, including duplicates. | ||||
|  */ | ||||
| export default function getIdentifierNames(code, tokens) { | ||||
|   const names = []; | ||||
|   for (const token of tokens) { | ||||
|     if (token.type === tt.name) { | ||||
|       names.push(code.slice(token.start, token.end)); | ||||
|     } | ||||
|   } | ||||
|   return names; | ||||
| } | ||||
							
								
								
									
92 node_modules/sucrase/dist/esm/util/getImportExportSpecifierInfo.js generated vendored Normal file
|  | @@ -0,0 +1,92 @@ | |||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
|   | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Determine information about this named import or named export specifier. | ||||
|  * | ||||
|  * This syntax is the `a` from statements like these: | ||||
|  * import {A} from "./foo"; | ||||
|  * export {A}; | ||||
|  * export {A} from "./foo"; | ||||
|  * | ||||
|  * As it turns out, we can exactly characterize the syntax meaning by simply | ||||
|  * counting the number of tokens, which can be from 1 to 4: | ||||
|  * {A} | ||||
|  * {type A} | ||||
|  * {A as B} | ||||
|  * {type A as B} | ||||
|  * | ||||
|  * In the type case, we never actually need the names in practice, so don't get | ||||
|  * them. | ||||
|  * | ||||
|  * TODO: There's some redundancy with the type detection here and the isType | ||||
|  * flag that's already present on tokens in TS mode. This function could | ||||
|  * potentially be simplified and/or pushed to the call sites to avoid the object | ||||
|  * allocation. | ||||
|  */ | ||||
| export default function getImportExportSpecifierInfo( | ||||
|   tokens, | ||||
|   index = tokens.currentIndex(), | ||||
| ) { | ||||
|   let endIndex = index + 1; | ||||
|   if (isSpecifierEnd(tokens, endIndex)) { | ||||
|     // import {A}
 | ||||
|     const name = tokens.identifierNameAtIndex(index); | ||||
|     return { | ||||
|       isType: false, | ||||
|       leftName: name, | ||||
|       rightName: name, | ||||
|       endIndex, | ||||
|     }; | ||||
|   } | ||||
|   endIndex++; | ||||
|   if (isSpecifierEnd(tokens, endIndex)) { | ||||
|     // import {type A}
 | ||||
|     return { | ||||
|       isType: true, | ||||
|       leftName: null, | ||||
|       rightName: null, | ||||
|       endIndex, | ||||
|     }; | ||||
|   } | ||||
|   endIndex++; | ||||
|   if (isSpecifierEnd(tokens, endIndex)) { | ||||
|     // import {A as B}
 | ||||
|     return { | ||||
|       isType: false, | ||||
|       leftName: tokens.identifierNameAtIndex(index), | ||||
|       rightName: tokens.identifierNameAtIndex(index + 2), | ||||
|       endIndex, | ||||
|     }; | ||||
|   } | ||||
|   endIndex++; | ||||
|   if (isSpecifierEnd(tokens, endIndex)) { | ||||
|     // import {type A as B}
 | ||||
|     return { | ||||
|       isType: true, | ||||
|       leftName: null, | ||||
|       rightName: null, | ||||
|       endIndex, | ||||
|     }; | ||||
|   } | ||||
|   throw new Error(`Unexpected import/export specifier at ${index}`); | ||||
| } | ||||
| 
 | ||||
| function isSpecifierEnd(tokens, index) { | ||||
|   const token = tokens.tokens[index]; | ||||
|   return token.type === tt.braceR || token.type === tt.comma; | ||||
| } | ||||
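The "count the tokens" observation from the doc comment maps the four specifier shapes (`A`, `type A`, `A as B`, `type A as B`) to lengths 1 through 4. A toy classifier over plain string arrays showing just that mapping (hypothetical, unlike the real function it does not consult a token stream):

```javascript
// Classify a named import/export specifier purely by token count:
// 1 = `A`, 2 = `type A`, 3 = `A as B`, 4 = `type A as B`.
function classifySpecifier(toks) {
  switch (toks.length) {
    case 1:
      return {isType: false, leftName: toks[0], rightName: toks[0]};
    case 2: // `type A`: names aren't needed in the type case.
      return {isType: true, leftName: null, rightName: null};
    case 3: // `A as B`
      return {isType: false, leftName: toks[0], rightName: toks[2]};
    case 4: // `type A as B`
      return {isType: true, leftName: null, rightName: null};
    default:
      throw new Error("Unexpected specifier shape");
  }
}
```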
							
								
								
									
22 node_modules/sucrase/dist/esm/util/getJSXPragmaInfo.js generated vendored Normal file
|  | @@ -0,0 +1,22 @@ | |||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| export default function getJSXPragmaInfo(options) { | ||||
|   const [base, suffix] = splitPragma(options.jsxPragma || "React.createElement"); | ||||
|   const [fragmentBase, fragmentSuffix] = splitPragma(options.jsxFragmentPragma || "React.Fragment"); | ||||
|   return {base, suffix, fragmentBase, fragmentSuffix}; | ||||
| } | ||||
| 
 | ||||
| function splitPragma(pragma) { | ||||
|   let dotIndex = pragma.indexOf("."); | ||||
|   if (dotIndex === -1) { | ||||
|     dotIndex = pragma.length; | ||||
|   } | ||||
|   return [pragma.slice(0, dotIndex), pragma.slice(dotIndex)]; | ||||
| } | ||||
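`splitPragma` divides a JSX pragma at its first dot into the imported base and the member-access suffix; a pragma with no dot yields an empty suffix. Reproducing the function above verbatim makes this easy to check:

```javascript
// splitPragma (as defined above): split a JSX pragma at the first dot
// into [importedBase, memberSuffix].
function splitPragma(pragma) {
  let dotIndex = pragma.indexOf(".");
  if (dotIndex === -1) {
    dotIndex = pragma.length;
  }
  return [pragma.slice(0, dotIndex), pragma.slice(dotIndex)];
}
```

For example, `"React.createElement"` splits into `["React", ".createElement"]`, while a dotless pragma like `"h"` splits into `["h", ""]`.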
							
								
								
									
43 node_modules/sucrase/dist/esm/util/getNonTypeIdentifiers.js generated vendored Normal file
|  | @@ -0,0 +1,43 @@ | |||
| 
 | ||||
| import {IdentifierRole} from "../parser/tokenizer"; | ||||
| import {TokenType, TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| import {startsWithLowerCase} from "../transformers/JSXTransformer"; | ||||
| import getJSXPragmaInfo from "./getJSXPragmaInfo"; | ||||
| 
 | ||||
| export function getNonTypeIdentifiers(tokens, options) { | ||||
|   const jsxPragmaInfo = getJSXPragmaInfo(options); | ||||
|   const nonTypeIdentifiers = new Set(); | ||||
|   for (let i = 0; i < tokens.tokens.length; i++) { | ||||
|     const token = tokens.tokens[i]; | ||||
|     if ( | ||||
|       token.type === tt.name && | ||||
|       !token.isType && | ||||
|       (token.identifierRole === IdentifierRole.Access || | ||||
|         token.identifierRole === IdentifierRole.ObjectShorthand || | ||||
|         token.identifierRole === IdentifierRole.ExportAccess) && | ||||
|       !token.shadowsGlobal | ||||
|     ) { | ||||
|       nonTypeIdentifiers.add(tokens.identifierNameForToken(token)); | ||||
|     } | ||||
|     if (token.type === tt.jsxTagStart) { | ||||
|       nonTypeIdentifiers.add(jsxPragmaInfo.base); | ||||
|     } | ||||
|     if ( | ||||
|       token.type === tt.jsxTagStart && | ||||
|       i + 1 < tokens.tokens.length && | ||||
|       tokens.tokens[i + 1].type === tt.jsxTagEnd | ||||
|     ) { | ||||
|       nonTypeIdentifiers.add(jsxPragmaInfo.base); | ||||
|       nonTypeIdentifiers.add(jsxPragmaInfo.fragmentBase); | ||||
|     } | ||||
|     if (token.type === tt.jsxName && token.identifierRole === IdentifierRole.Access) { | ||||
|       const identifierName = tokens.identifierNameForToken(token); | ||||
|       // Lower-case single-component tag names like "div" don't count.
 | ||||
|       if (!startsWithLowerCase(identifierName) || tokens.tokens[i + 1].type === TokenType.dot) { | ||||
|         nonTypeIdentifiers.add(tokens.identifierNameForToken(token)); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   return nonTypeIdentifiers; | ||||
| } | ||||
							
								
								
									
84 node_modules/sucrase/dist/esm/util/getTSImportedNames.js generated vendored Normal file
|  | @@ -0,0 +1,84 @@ | |||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| import getImportExportSpecifierInfo from "./getImportExportSpecifierInfo"; | ||||
| 
 | ||||
| /** | ||||
|  * Special case code to scan for imported names in ESM TypeScript. We need to do this so we can | ||||
|  * properly get globals so we can compute shadowed globals. | ||||
|  * | ||||
|  * This is similar to logic in CJSImportProcessor, but trimmed down to avoid logic with CJS | ||||
|  * replacement and flow type imports. | ||||
|  */ | ||||
| export default function getTSImportedNames(tokens) { | ||||
|   const importedNames = new Set(); | ||||
|   for (let i = 0; i < tokens.tokens.length; i++) { | ||||
|     if ( | ||||
|       tokens.matches1AtIndex(i, tt._import) && | ||||
|       !tokens.matches3AtIndex(i, tt._import, tt.name, tt.eq) | ||||
|     ) { | ||||
|       collectNamesForImport(tokens, i, importedNames); | ||||
|     } | ||||
|   } | ||||
|   return importedNames; | ||||
| } | ||||
| 
 | ||||
| function collectNamesForImport( | ||||
|   tokens, | ||||
|   index, | ||||
|   importedNames, | ||||
| ) { | ||||
|   index++; | ||||
| 
 | ||||
|   if (tokens.matches1AtIndex(index, tt.parenL)) { | ||||
|     // Dynamic import, so nothing to do
 | ||||
|     return; | ||||
|   } | ||||
| 
 | ||||
|   if (tokens.matches1AtIndex(index, tt.name)) { | ||||
|     importedNames.add(tokens.identifierNameAtIndex(index)); | ||||
|     index++; | ||||
|     if (tokens.matches1AtIndex(index, tt.comma)) { | ||||
|       index++; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   if (tokens.matches1AtIndex(index, tt.star)) { | ||||
|     // * as
 | ||||
|     index += 2; | ||||
|     importedNames.add(tokens.identifierNameAtIndex(index)); | ||||
|     index++; | ||||
|   } | ||||
| 
 | ||||
|   if (tokens.matches1AtIndex(index, tt.braceL)) { | ||||
|     index++; | ||||
|     collectNamesForNamedImport(tokens, index, importedNames); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function collectNamesForNamedImport( | ||||
|   tokens, | ||||
|   index, | ||||
|   importedNames, | ||||
| ) { | ||||
|   while (true) { | ||||
|     if (tokens.matches1AtIndex(index, tt.braceR)) { | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const specifierInfo = getImportExportSpecifierInfo(tokens, index); | ||||
|     index = specifierInfo.endIndex; | ||||
|     if (!specifierInfo.isType) { | ||||
|       importedNames.add(specifierInfo.rightName); | ||||
|     } | ||||
| 
 | ||||
|     if (tokens.matches2AtIndex(index, tt.comma, tt.braceR)) { | ||||
|       return; | ||||
|     } else if (tokens.matches1AtIndex(index, tt.braceR)) { | ||||
|       return; | ||||
|     } else if (tokens.matches1AtIndex(index, tt.comma)) { | ||||
|       index++; | ||||
|     } else { | ||||
|       throw new Error(`Unexpected token: ${JSON.stringify(tokens.tokens[index])}`); | ||||
|     } | ||||
|   } | ||||
| } | ||||
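To make the collection logic above concrete, here is a rough, regex-based approximation of what `getTSImportedNames` gathers. This is an illustration only: the real implementation walks sucrase's token stream, and this sketch ignores edge cases such as comments, string escapes, and `import type` statements.

```javascript
// Rough approximation of getTSImportedNames: collect the local binding names
// introduced by ESM import statements. Illustration only; the real code walks
// sucrase's token stream rather than using regexes.
function getImportedNamesApprox(code) {
  const names = new Set();
  const importRe = /import\s+([^'"]+?)\s+from\s+['"]/g;
  let m;
  while ((m = importRe.exec(code)) !== null) {
    const clause = m[1];
    const star = clause.match(/\*\s*as\s+(\w+)/); // `* as ns`
    if (star) names.add(star[1]);
    const braces = clause.match(/\{([^}]*)\}/); // `{a, b as c, type T}`
    if (braces) {
      for (const part of braces[1].split(",")) {
        const p = part.trim();
        if (!p) continue;
        if (/^type\s/.test(p)) continue; // skip type-only specifiers
        const renamed = p.match(/\bas\s+(\w+)$/);
        names.add(renamed ? renamed[1] : p.split(/\s+/).pop());
      }
    }
    const def = clause.match(/^(\w+)\s*(?:,|$)/); // default import
    if (def) names.add(def[1]);
  }
  return names;
}
```

Only the right-hand (local) name of a renamed specifier is collected, matching the `specifierInfo.rightName` behavior above.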
							
								
								
									
38 node_modules/sucrase/dist/esm/util/isAsyncOperation.js generated vendored Normal file
|  | @ -0,0 +1,38 @@ | |||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Determine whether this optional chain or nullish coalescing operation has any await statements in | ||||
|  * it. If so, we'll need to transpile to an async operation. | ||||
|  * | ||||
|  * We compute this by walking the length of the operation and returning true if we see an await | ||||
|  * keyword used as a real await (rather than an object key or property access). Nested optional | ||||
|  * chain/nullish operations need to be tracked but do not silence await, whereas a nested async | ||||
|  * function (or any other nested scope) makes the await not count. | ||||
|  */ | ||||
| export default function isAsyncOperation(tokens) { | ||||
|   let index = tokens.currentIndex(); | ||||
|   let depth = 0; | ||||
|   const startToken = tokens.currentToken(); | ||||
|   do { | ||||
|     const token = tokens.tokens[index]; | ||||
|     if (token.isOptionalChainStart) { | ||||
|       depth++; | ||||
|     } | ||||
|     if (token.isOptionalChainEnd) { | ||||
|       depth--; | ||||
|     } | ||||
|     depth += token.numNullishCoalesceStarts; | ||||
|     depth -= token.numNullishCoalesceEnds; | ||||
| 
 | ||||
|     if ( | ||||
|       token.contextualKeyword === ContextualKeyword._await && | ||||
|       token.identifierRole == null && | ||||
|       token.scopeDepth === startToken.scopeDepth | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|     index += 1; | ||||
|   } while (depth > 0 && index < tokens.tokens.length); | ||||
|   return false; | ||||
| } | ||||
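The depth-counting walk above can be sketched on a simplified token shape. The field names here (`chainStart`, `chainEnd`, `isAwait`, `scopeDepth`) are hypothetical stand-ins for sucrase's `isOptionalChainStart`/`isOptionalChainEnd`, the `await` contextual-keyword check, and `scopeDepth`:

```javascript
// Simplified sketch of isAsyncOperation. Tokens are plain objects with
// assumed fields: chainStart/chainEnd mark optional-chain boundaries,
// isAwait marks a real `await`, and scopeDepth tracks scope nesting.
function isAsyncOperationSketch(tokens, start) {
  let depth = 0;
  let i = start;
  do {
    const token = tokens[i];
    if (token.chainStart) depth++;
    if (token.chainEnd) depth--;
    // An await at the same scope depth as the chain start makes this async.
    if (token.isAwait && token.scopeDepth === tokens[start].scopeDepth) {
      return true;
    }
    i++;
  } while (depth > 0 && i < tokens.length);
  return false;
}
```

An `await` inside a nested scope has a larger `scopeDepth` than the start token, so it does not count, mirroring the comparison against `startToken.scopeDepth` above.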
							
								
								
									
18 node_modules/sucrase/dist/esm/util/isExportFrom.js generated vendored Normal file
|  | @ -0,0 +1,18 @@ | |||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Starting at `export {`, look ahead and return `true` if this is an | ||||
|  * `export {...} from` statement and `false` if this is a plain multi-export. | ||||
|  */ | ||||
| export default function isExportFrom(tokens) { | ||||
|   let closeBraceIndex = tokens.currentIndex(); | ||||
|   while (!tokens.matches1AtIndex(closeBraceIndex, tt.braceR)) { | ||||
|     closeBraceIndex++; | ||||
|   } | ||||
|   return ( | ||||
|     tokens.matchesContextualAtIndex(closeBraceIndex + 1, ContextualKeyword._from) && | ||||
|     tokens.matches1AtIndex(closeBraceIndex + 2, tt.string) | ||||
|   ); | ||||
| } | ||||
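The lookahead above can be sketched over a simplified token list. Each token is a `{type, value}` object; this shape is an assumption for illustration, not sucrase's real `Token` type:

```javascript
// Sketch of isExportFrom: starting at `export {`, scan to the closing brace
// and check whether it is followed by `from "<module>"`.
function isExportFromSketch(tokens, index) {
  let i = index;
  while (tokens[i].type !== "braceR") {
    i++;
  }
  const next = tokens[i + 1];
  const after = tokens[i + 2];
  return Boolean(
    next && next.type === "name" && next.value === "from" &&
    after && after.type === "string"
  );
}
```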
							
								
								
									
81 node_modules/sucrase/dist/esm/util/isIdentifier.js generated vendored Normal file
|  | @ -0,0 +1,81 @@ | |||
| import {IS_IDENTIFIER_CHAR, IS_IDENTIFIER_START} from "../parser/util/identifier"; | ||||
| 
 | ||||
| // https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Lexical_grammar
 | ||||
| // Hard-code a list of reserved words rather than trying to use keywords or contextual keywords
 | ||||
| // from the parser, since currently there are various exceptions, like `package` being reserved
 | ||||
| // but unused and various contextual keywords being reserved. Note that we assume that all code
 | ||||
| // compiled by Sucrase is in a module, so strict mode words and await are all considered reserved
 | ||||
| // here.
 | ||||
| const RESERVED_WORDS = new Set([ | ||||
|   // Reserved keywords as of ECMAScript 2015
 | ||||
|   "break", | ||||
|   "case", | ||||
|   "catch", | ||||
|   "class", | ||||
|   "const", | ||||
|   "continue", | ||||
|   "debugger", | ||||
|   "default", | ||||
|   "delete", | ||||
|   "do", | ||||
|   "else", | ||||
|   "export", | ||||
|   "extends", | ||||
|   "finally", | ||||
|   "for", | ||||
|   "function", | ||||
|   "if", | ||||
|   "import", | ||||
|   "in", | ||||
|   "instanceof", | ||||
|   "new", | ||||
|   "return", | ||||
|   "super", | ||||
|   "switch", | ||||
|   "this", | ||||
|   "throw", | ||||
|   "try", | ||||
|   "typeof", | ||||
|   "var", | ||||
|   "void", | ||||
|   "while", | ||||
|   "with", | ||||
|   "yield", | ||||
|   // Future reserved keywords
 | ||||
|   "enum", | ||||
|   "implements", | ||||
|   "interface", | ||||
|   "let", | ||||
|   "package", | ||||
|   "private", | ||||
|   "protected", | ||||
|   "public", | ||||
|   "static", | ||||
|   "await", | ||||
|   // Literals that cannot be used as identifiers
 | ||||
|   "false", | ||||
|   "null", | ||||
|   "true", | ||||
| ]); | ||||
| 
 | ||||
| /** | ||||
|  * Determine if the given name is a legal variable name. | ||||
|  * | ||||
|  * This is needed when transforming TypeScript enums; if an enum key is a valid | ||||
|  * variable name, it might be referenced later in the enum, so we need to | ||||
|  * declare a variable. | ||||
|  */ | ||||
| export default function isIdentifier(name) { | ||||
|   if (name.length === 0) { | ||||
|     return false; | ||||
|   } | ||||
|   if (!IS_IDENTIFIER_START[name.charCodeAt(0)]) { | ||||
|     return false; | ||||
|   } | ||||
|   for (let i = 1; i < name.length; i++) { | ||||
|     if (!IS_IDENTIFIER_CHAR[name.charCodeAt(i)]) { | ||||
|       return false; | ||||
|     } | ||||
|   } | ||||
|   return !RESERVED_WORDS.has(name); | ||||
| } | ||||
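An ASCII-only approximation of the check above, for illustration. The real code uses the `IS_IDENTIFIER_START`/`IS_IDENTIFIER_CHAR` lookup tables, which also cover non-ASCII identifier characters, and the full reserved-word list; this sketch uses a regex and a small subset of that list:

```javascript
// ASCII-only approximation of isIdentifier. The real implementation uses
// character-code lookup tables and the full RESERVED_WORDS set.
const SOME_RESERVED = new Set(["let", "await", "class", "null", "true", "false"]);

function isIdentifierSketch(name) {
  return /^[A-Za-z_$][A-Za-z0-9_$]*$/.test(name) && !SOME_RESERVED.has(name);
}
```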
							
								
								
									
22 node_modules/sucrase/dist/esm/util/removeMaybeImportAttributes.js generated vendored Normal file
|  | @ -0,0 +1,22 @@ | |||
| import {ContextualKeyword} from "../parser/tokenizer/keywords"; | ||||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Starting at a potential `with` or (legacy) `assert` token, remove the import | ||||
|  * attributes if they exist. | ||||
|  */ | ||||
| export function removeMaybeImportAttributes(tokens) { | ||||
|   if ( | ||||
|     tokens.matches2(tt._with, tt.braceL) || | ||||
|     (tokens.matches2(tt.name, tt.braceL) && tokens.matchesContextual(ContextualKeyword._assert)) | ||||
|   ) { | ||||
|     // with or assert
 | ||||
|     tokens.removeToken(); | ||||
|     // {
 | ||||
|     tokens.removeToken(); | ||||
|     tokens.removeBalancedCode(); | ||||
|     // }
 | ||||
|     tokens.removeToken(); | ||||
|   } | ||||
| } | ||||
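At the source-text level, the token removal above amounts to dropping a trailing `with { ... }` or legacy `assert { ... }` clause. A regex sketch of that effect (illustration only; it assumes no nested braces inside the attribute clause, which the token-based code handles via `removeBalancedCode`):

```javascript
// Source-level sketch of removeMaybeImportAttributes: strip a trailing
// `with { ... }` or `assert { ... }` import-attributes clause.
function stripImportAttributes(statement) {
  return statement.replace(/\s+(?:with|assert)\s*\{[^}]*\}/, "");
}
```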
							
								
								
									
38 node_modules/sucrase/dist/esm/util/shouldElideDefaultExport.js generated vendored Normal file
|  | @ -0,0 +1,38 @@ | |||
| import {TokenType as tt} from "../parser/tokenizer/types"; | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Common method sharing code between CJS and ESM cases, since they're the same here. | ||||
|  */ | ||||
| export default function shouldElideDefaultExport( | ||||
|   isTypeScriptTransformEnabled, | ||||
|   keepUnusedImports, | ||||
|   tokens, | ||||
|   declarationInfo, | ||||
| ) { | ||||
|   if (!isTypeScriptTransformEnabled || keepUnusedImports) { | ||||
|     return false; | ||||
|   } | ||||
|   const exportToken = tokens.currentToken(); | ||||
|   if (exportToken.rhsEndIndex == null) { | ||||
|     throw new Error("Expected non-null rhsEndIndex on export token."); | ||||
|   } | ||||
|   // The export must be of the form `export default a` or `export default a;`.
 | ||||
|   const numTokens = exportToken.rhsEndIndex - tokens.currentIndex(); | ||||
|   if ( | ||||
|     numTokens !== 3 && | ||||
|     !(numTokens === 4 && tokens.matches1AtIndex(exportToken.rhsEndIndex - 1, tt.semi)) | ||||
|   ) { | ||||
|     return false; | ||||
|   } | ||||
|   const identifierToken = tokens.tokenAtRelativeIndex(2); | ||||
|   if (identifierToken.type !== tt.name) { | ||||
|     return false; | ||||
|   } | ||||
|   const exportedName = tokens.identifierNameForToken(identifierToken); | ||||
|   return ( | ||||
|     declarationInfo.typeDeclarations.has(exportedName) && | ||||
|     !declarationInfo.valueDeclarations.has(exportedName) | ||||
|   ); | ||||
| } | ||||
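The shape check above (exactly three tokens, or four with a trailing semicolon) can be sketched on a simplified token list. The `{type, value}` token shape is an assumption for illustration:

```javascript
// Sketch of the shape check in shouldElideDefaultExport: the statement must
// be exactly `export default <name>`, optionally followed by a semicolon.
function isSimpleDefaultExport(tokens) {
  const n = tokens.length;
  const core = n === 3 || (n === 4 && tokens[3].type === "semi");
  return core && tokens[0].value === "export" &&
    tokens[1].value === "default" && tokens[2].type === "name";
}
```

Anything longer, such as `export default f()`, fails the count and is never elided.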
							
								
								
									
98 node_modules/sucrase/dist/identifyShadowedGlobals.js generated vendored Normal file
|  | @ -0,0 +1,98 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _tokenizer = require('./parser/tokenizer'); | ||||
| 
 | ||||
| var _types = require('./parser/tokenizer/types'); | ||||
| 
 | ||||
| 
 | ||||
| /** | ||||
|  * Traverse the given tokens and modify them if necessary to indicate that some names shadow global | ||||
|  * variables. | ||||
|  */ | ||||
|  function identifyShadowedGlobals( | ||||
|   tokens, | ||||
|   scopes, | ||||
|   globalNames, | ||||
| ) { | ||||
|   if (!hasShadowedGlobals(tokens, globalNames)) { | ||||
|     return; | ||||
|   } | ||||
|   markShadowedGlobals(tokens, scopes, globalNames); | ||||
| } exports.default = identifyShadowedGlobals; | ||||
| 
 | ||||
| /** | ||||
|  * We can do a fast up-front check to see if there are any declarations to global names. If not, | ||||
|  * then there's no point in computing scope assignments. | ||||
|  */ | ||||
| // Exported for testing.
 | ||||
|  function hasShadowedGlobals(tokens, globalNames) { | ||||
|   for (const token of tokens.tokens) { | ||||
|     if ( | ||||
|       token.type === _types.TokenType.name && | ||||
|       !token.isType && | ||||
|       _tokenizer.isNonTopLevelDeclaration.call(void 0, token) && | ||||
|       globalNames.has(tokens.identifierNameForToken(token)) | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } exports.hasShadowedGlobals = hasShadowedGlobals; | ||||
| 
 | ||||
| function markShadowedGlobals( | ||||
|   tokens, | ||||
|   scopes, | ||||
|   globalNames, | ||||
| ) { | ||||
|   const scopeStack = []; | ||||
|   let scopeIndex = scopes.length - 1; | ||||
|   // Scopes were generated at completion time, so they are sorted by end index; walking them
 | ||||
|   // backwards lets us maintain a proper stack.
 | ||||
|   for (let i = tokens.tokens.length - 1; ; i--) { | ||||
|     while (scopeStack.length > 0 && scopeStack[scopeStack.length - 1].startTokenIndex === i + 1) { | ||||
|       scopeStack.pop(); | ||||
|     } | ||||
|     while (scopeIndex >= 0 && scopes[scopeIndex].endTokenIndex === i + 1) { | ||||
|       scopeStack.push(scopes[scopeIndex]); | ||||
|       scopeIndex--; | ||||
|     } | ||||
|     // Process scopes after the last iteration so we can make sure we pop all of them.
 | ||||
|     if (i < 0) { | ||||
|       break; | ||||
|     } | ||||
| 
 | ||||
|     const token = tokens.tokens[i]; | ||||
|     const name = tokens.identifierNameForToken(token); | ||||
|     if (scopeStack.length > 1 && !token.isType && token.type === _types.TokenType.name && globalNames.has(name)) { | ||||
|       if (_tokenizer.isBlockScopedDeclaration.call(void 0, token)) { | ||||
|         markShadowedForScope(scopeStack[scopeStack.length - 1], tokens, name); | ||||
|       } else if (_tokenizer.isFunctionScopedDeclaration.call(void 0, token)) { | ||||
|         let stackIndex = scopeStack.length - 1; | ||||
|         while (stackIndex > 0 && !scopeStack[stackIndex].isFunctionScope) { | ||||
|           stackIndex--; | ||||
|         } | ||||
|         if (stackIndex < 0) { | ||||
|           throw new Error("Did not find parent function scope."); | ||||
|         } | ||||
|         markShadowedForScope(scopeStack[stackIndex], tokens, name); | ||||
|       } | ||||
|     } | ||||
|   } | ||||
|   if (scopeStack.length > 0) { | ||||
|     throw new Error("Expected empty scope stack after processing file."); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| function markShadowedForScope(scope, tokens, name) { | ||||
|   for (let i = scope.startTokenIndex; i < scope.endTokenIndex; i++) { | ||||
|     const token = tokens.tokens[i]; | ||||
|     if ( | ||||
|       (token.type === _types.TokenType.name || token.type === _types.TokenType.jsxName) && | ||||
|       tokens.identifierNameForToken(token) === name | ||||
|     ) { | ||||
|       token.shadowsGlobal = true; | ||||
|     } | ||||
|   } | ||||
| } | ||||
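The final marking step above can be sketched in isolation. Token and scope shapes here are simplified assumptions (`{type, value}` tokens and `{startTokenIndex, endTokenIndex}` scopes):

```javascript
// Sketch of markShadowedForScope: within a scope's token range, flag every
// identifier token whose name matches the shadowing declaration.
function markShadowed(tokens, scope, name) {
  for (let i = scope.startTokenIndex; i < scope.endTokenIndex; i++) {
    const token = tokens[i];
    if (token.type === "name" && token.value === name) {
      token.shadowsGlobal = true;
    }
  }
}
```

Tokens outside the scope range are untouched, so the same name can still refer to the global elsewhere in the file.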
							
								
								
									
133 node_modules/sucrase/dist/index.js generated vendored Normal file
|  | @ -0,0 +1,133 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }var _CJSImportProcessor = require('./CJSImportProcessor'); var _CJSImportProcessor2 = _interopRequireDefault(_CJSImportProcessor); | ||||
| var _computeSourceMap = require('./computeSourceMap'); var _computeSourceMap2 = _interopRequireDefault(_computeSourceMap); | ||||
| var _HelperManager = require('./HelperManager'); | ||||
| var _identifyShadowedGlobals = require('./identifyShadowedGlobals'); var _identifyShadowedGlobals2 = _interopRequireDefault(_identifyShadowedGlobals); | ||||
| var _NameManager = require('./NameManager'); var _NameManager2 = _interopRequireDefault(_NameManager); | ||||
| var _Options = require('./Options'); | ||||
| 
 | ||||
| var _parser = require('./parser'); | ||||
| 
 | ||||
| var _TokenProcessor = require('./TokenProcessor'); var _TokenProcessor2 = _interopRequireDefault(_TokenProcessor); | ||||
| var _RootTransformer = require('./transformers/RootTransformer'); var _RootTransformer2 = _interopRequireDefault(_RootTransformer); | ||||
| var _formatTokens = require('./util/formatTokens'); var _formatTokens2 = _interopRequireDefault(_formatTokens); | ||||
| var _getTSImportedNames = require('./util/getTSImportedNames'); var _getTSImportedNames2 = _interopRequireDefault(_getTSImportedNames); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| ; | ||||
| 
 | ||||
|  function getVersion() { | ||||
|   /* istanbul ignore next */ | ||||
|   return "3.34.0"; | ||||
| } exports.getVersion = getVersion; | ||||
| 
 | ||||
|  function transform(code, options) { | ||||
|   _Options.validateOptions.call(void 0, options); | ||||
|   try { | ||||
|     const sucraseContext = getSucraseContext(code, options); | ||||
|     const transformer = new (0, _RootTransformer2.default)( | ||||
|       sucraseContext, | ||||
|       options.transforms, | ||||
|       Boolean(options.enableLegacyBabel5ModuleInterop), | ||||
|       options, | ||||
|     ); | ||||
|     const transformerResult = transformer.transform(); | ||||
|     let result = {code: transformerResult.code}; | ||||
|     if (options.sourceMapOptions) { | ||||
|       if (!options.filePath) { | ||||
|         throw new Error("filePath must be specified when generating a source map."); | ||||
|       } | ||||
|       result = { | ||||
|         ...result, | ||||
|         sourceMap: _computeSourceMap2.default.call(void 0,  | ||||
|           transformerResult, | ||||
|           options.filePath, | ||||
|           options.sourceMapOptions, | ||||
|           code, | ||||
|           sucraseContext.tokenProcessor.tokens, | ||||
|         ), | ||||
|       }; | ||||
|     } | ||||
|     return result; | ||||
|     // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|   } catch (e) { | ||||
|     if (options.filePath) { | ||||
|       e.message = `Error transforming ${options.filePath}: ${e.message}`; | ||||
|     } | ||||
|     throw e; | ||||
|   } | ||||
| } exports.transform = transform; | ||||
| 
 | ||||
| /** | ||||
|  * Return a string representation of the sucrase tokens, mostly useful for | ||||
|  * diagnostic purposes. | ||||
|  */ | ||||
|  function getFormattedTokens(code, options) { | ||||
|   const tokens = getSucraseContext(code, options).tokenProcessor.tokens; | ||||
|   return _formatTokens2.default.call(void 0, code, tokens); | ||||
| } exports.getFormattedTokens = getFormattedTokens; | ||||
| 
 | ||||
| /** | ||||
|  * Call into the parser/tokenizer and do some further preprocessing: | ||||
|  * - Come up with a set of used names so that we can assign new names. | ||||
|  * - Preprocess all import/export statements so we know which globals we are interested in. | ||||
|  * - Compute situations where any of those globals are shadowed. | ||||
|  * | ||||
|  * In the future, some of these preprocessing steps can be skipped based on what actual work is | ||||
|  * being done. | ||||
|  */ | ||||
| function getSucraseContext(code, options) { | ||||
|   const isJSXEnabled = options.transforms.includes("jsx"); | ||||
|   const isTypeScriptEnabled = options.transforms.includes("typescript"); | ||||
|   const isFlowEnabled = options.transforms.includes("flow"); | ||||
|   const disableESTransforms = options.disableESTransforms === true; | ||||
|   const file = _parser.parse.call(void 0, code, isJSXEnabled, isTypeScriptEnabled, isFlowEnabled); | ||||
|   const tokens = file.tokens; | ||||
|   const scopes = file.scopes; | ||||
| 
 | ||||
|   const nameManager = new (0, _NameManager2.default)(code, tokens); | ||||
|   const helperManager = new (0, _HelperManager.HelperManager)(nameManager); | ||||
|   const tokenProcessor = new (0, _TokenProcessor2.default)( | ||||
|     code, | ||||
|     tokens, | ||||
|     isFlowEnabled, | ||||
|     disableESTransforms, | ||||
|     helperManager, | ||||
|   ); | ||||
|   const enableLegacyTypeScriptModuleInterop = Boolean(options.enableLegacyTypeScriptModuleInterop); | ||||
| 
 | ||||
|   let importProcessor = null; | ||||
|   if (options.transforms.includes("imports")) { | ||||
|     importProcessor = new (0, _CJSImportProcessor2.default)( | ||||
|       nameManager, | ||||
|       tokenProcessor, | ||||
|       enableLegacyTypeScriptModuleInterop, | ||||
|       options, | ||||
|       options.transforms.includes("typescript"), | ||||
|       Boolean(options.keepUnusedImports), | ||||
|       helperManager, | ||||
|     ); | ||||
|     importProcessor.preprocessTokens(); | ||||
|     // We need to mark shadowed globals after processing imports so we know what the globals are,
 | ||||
|     // but before type-only import pruning, since that relies on shadowing information.
 | ||||
|     _identifyShadowedGlobals2.default.call(void 0, tokenProcessor, scopes, importProcessor.getGlobalNames()); | ||||
|     if (options.transforms.includes("typescript") && !options.keepUnusedImports) { | ||||
|       importProcessor.pruneTypeOnlyImports(); | ||||
|     } | ||||
|   } else if (options.transforms.includes("typescript") && !options.keepUnusedImports) { | ||||
|     // Shadowed global detection is needed for TS implicit elision of imported names.
 | ||||
|     _identifyShadowedGlobals2.default.call(void 0, tokenProcessor, scopes, _getTSImportedNames2.default.call(void 0, tokenProcessor)); | ||||
|   } | ||||
|   return {tokenProcessor, scopes, nameManager, importProcessor, helperManager}; | ||||
| } | ||||
							
								
								
									
31 node_modules/sucrase/dist/parser/index.js generated vendored Normal file
|  | @ -0,0 +1,31 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| 
 | ||||
| var _base = require('./traverser/base'); | ||||
| var _index = require('./traverser/index'); | ||||
| 
 | ||||
|  class File { | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor(tokens, scopes) { | ||||
|     this.tokens = tokens; | ||||
|     this.scopes = scopes; | ||||
|   } | ||||
| } exports.File = File; | ||||
| 
 | ||||
|  function parse( | ||||
|   input, | ||||
|   isJSXEnabled, | ||||
|   isTypeScriptEnabled, | ||||
|   isFlowEnabled, | ||||
| ) { | ||||
|   if (isFlowEnabled && isTypeScriptEnabled) { | ||||
|     throw new Error("Cannot combine flow and typescript plugins."); | ||||
|   } | ||||
|   _base.initParser.call(void 0, input, isJSXEnabled, isTypeScriptEnabled, isFlowEnabled); | ||||
|   const result = _index.parseFile.call(void 0, ); | ||||
|   if (_base.state.error) { | ||||
|     throw _base.augmentError.call(void 0, _base.state.error); | ||||
|   } | ||||
|   return result; | ||||
| } exports.parse = parse; | ||||
							
								
								
									
1105 node_modules/sucrase/dist/parser/plugins/flow.js generated vendored Normal file
File diff suppressed because it is too large
											
										
									
								
							
							
								
								
									
										367
									
								
								node_modules/sucrase/dist/parser/plugins/jsx/index.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
										367
									
								
								node_modules/sucrase/dist/parser/plugins/jsx/index.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							|  | @ -0,0 +1,367 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _index = require('../../tokenizer/index'); | ||||
| var _types = require('../../tokenizer/types'); | ||||
| var _base = require('../../traverser/base'); | ||||
| var _expression = require('../../traverser/expression'); | ||||
| var _util = require('../../traverser/util'); | ||||
| var _charcodes = require('../../util/charcodes'); | ||||
| var _identifier = require('../../util/identifier'); | ||||
| var _typescript = require('../typescript'); | ||||
| 
 | ||||
| /** | ||||
|  * Read token with JSX contents. | ||||
|  * | ||||
|  * In addition to detecting jsxTagStart and also regular tokens that might be | ||||
|  * part of an expression, this code detects the start and end of text ranges | ||||
|  * within JSX children. In order to properly count the number of children, we | ||||
|  * distinguish jsxText from jsxEmptyText, which is a text range that simplifies | ||||
|  * to the empty string after JSX whitespace trimming. | ||||
|  * | ||||
|  * It turns out that a JSX text range will simplify to the empty string if and | ||||
|  * only if both of these conditions hold: | ||||
|  * - The range consists entirely of whitespace characters (only counting space, | ||||
|  *   tab, \r, and \n). | ||||
|  * - The range has at least one newline. | ||||
|  * This can be proven by analyzing any implementation of whitespace trimming, | ||||
|  * e.g. formatJSXTextLiteral in Sucrase or cleanJSXElementLiteralChild in Babel. | ||||
|  */ | ||||
| function jsxReadToken() { | ||||
|   let sawNewline = false; | ||||
|   let sawNonWhitespace = false; | ||||
|   while (true) { | ||||
|     if (_base.state.pos >= _base.input.length) { | ||||
|       _util.unexpected.call(void 0, "Unterminated JSX contents"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const ch = _base.input.charCodeAt(_base.state.pos); | ||||
|     if (ch === _charcodes.charCodes.lessThan || ch === _charcodes.charCodes.leftCurlyBrace) { | ||||
|       if (_base.state.pos === _base.state.start) { | ||||
|         if (ch === _charcodes.charCodes.lessThan) { | ||||
|           _base.state.pos++; | ||||
|           _index.finishToken.call(void 0, _types.TokenType.jsxTagStart); | ||||
|           return; | ||||
|         } | ||||
|         _index.getTokenFromCode.call(void 0, ch); | ||||
|         return; | ||||
|       } | ||||
|       if (sawNewline && !sawNonWhitespace) { | ||||
|         _index.finishToken.call(void 0, _types.TokenType.jsxEmptyText); | ||||
|       } else { | ||||
|         _index.finishToken.call(void 0, _types.TokenType.jsxText); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     // This is part of JSX text.
 | ||||
|     if (ch === _charcodes.charCodes.lineFeed) { | ||||
|       sawNewline = true; | ||||
|     } else if (ch !== _charcodes.charCodes.space && ch !== _charcodes.charCodes.carriageReturn && ch !== _charcodes.charCodes.tab) { | ||||
|       sawNonWhitespace = true; | ||||
|     } | ||||
|     _base.state.pos++; | ||||
|   } | ||||
| } | ||||
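The trimming rule stated in the comment above (a JSX text range simplifies to the empty string iff it is all JSX whitespace and contains at least one newline) can be written directly as a predicate:

```javascript
// The jsxText vs jsxEmptyText distinction as a predicate: a JSX text range
// becomes the empty string after whitespace trimming iff it consists only of
// space/tab/\r/\n characters AND contains at least one newline.
function simplifiesToEmptyJSXText(text) {
  return /^[ \t\r\n]*$/.test(text) && text.includes("\n");
}
```

This matches the `sawNewline && !sawNonWhitespace` condition tracked in `jsxReadToken` above.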
| 
 | ||||
| function jsxReadString(quote) { | ||||
|   _base.state.pos++; | ||||
|   for (;;) { | ||||
|     if (_base.state.pos >= _base.input.length) { | ||||
|       _util.unexpected.call(void 0, "Unterminated string constant"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const ch = _base.input.charCodeAt(_base.state.pos); | ||||
|     if (ch === quote) { | ||||
|       _base.state.pos++; | ||||
|       break; | ||||
|     } | ||||
|     _base.state.pos++; | ||||
|   } | ||||
|   _index.finishToken.call(void 0, _types.TokenType.string); | ||||
| } | ||||
| 
 | ||||
| // Read a JSX identifier (valid tag or attribute name).
 | ||||
| //
 | ||||
| // Optimized version since JSX identifiers can't contain
 | ||||
| // escape characters and so can be read as single slice.
 | ||||
| // Also assumes that first character was already checked
 | ||||
| // by isIdentifierStart in readToken.
 | ||||
| 
 | ||||
| function jsxReadWord() { | ||||
|   let ch; | ||||
|   do { | ||||
|     if (_base.state.pos > _base.input.length) { | ||||
|       _util.unexpected.call(void 0, "Unexpectedly reached the end of input."); | ||||
|       return; | ||||
|     } | ||||
|     ch = _base.input.charCodeAt(++_base.state.pos); | ||||
|   } while (_identifier.IS_IDENTIFIER_CHAR[ch] || ch === _charcodes.charCodes.dash); | ||||
|   _index.finishToken.call(void 0, _types.TokenType.jsxName); | ||||
| } | ||||
| 
 | ||||
| // Parse next token as JSX identifier
 | ||||
| function jsxParseIdentifier() { | ||||
|   nextJSXTagToken(); | ||||
| } | ||||
| 
 | ||||
| // Parse namespaced identifier.
 | ||||
| function jsxParseNamespacedName(identifierRole) { | ||||
|   jsxParseIdentifier(); | ||||
|   if (!_index.eat.call(void 0, _types.TokenType.colon)) { | ||||
|     // Plain identifier, so this is an access.
 | ||||
|     _base.state.tokens[_base.state.tokens.length - 1].identifierRole = identifierRole; | ||||
|     return; | ||||
|   } | ||||
|   // Process the second half of the namespaced name.
 | ||||
|   jsxParseIdentifier(); | ||||
| } | ||||
| 
 | ||||
| // Parses element name in any form - namespaced, member
 | ||||
| // or single identifier.
 | ||||
| function jsxParseElementName() { | ||||
|   const firstTokenIndex = _base.state.tokens.length; | ||||
|   jsxParseNamespacedName(_index.IdentifierRole.Access); | ||||
|   let hadDot = false; | ||||
|   while (_index.match.call(void 0, _types.TokenType.dot)) { | ||||
|     hadDot = true; | ||||
|     nextJSXTagToken(); | ||||
|     jsxParseIdentifier(); | ||||
|   } | ||||
|   // For tags like <div> with a lowercase letter and no dots, the name is
 | ||||
|   // actually *not* an identifier access, since it's referring to a built-in
 | ||||
|   // tag name. Remove the identifier role in this case so that it's not
 | ||||
|   // accidentally transformed by the imports transform when preserving JSX.
 | ||||
|   if (!hadDot) { | ||||
|     const firstToken = _base.state.tokens[firstTokenIndex]; | ||||
|     const firstChar = _base.input.charCodeAt(firstToken.start); | ||||
|     if (firstChar >= _charcodes.charCodes.lowercaseA && firstChar <= _charcodes.charCodes.lowercaseZ) { | ||||
|       firstToken.identifierRole = null; | ||||
|     } | ||||
|   } | ||||
| } | ||||
|  | ||||
| // Parses any type of JSX attribute value. | ||||
| function jsxParseAttributeValue() { | ||||
|   switch (_base.state.type) { | ||||
|     case _types.TokenType.braceL: | ||||
|       _index.next.call(void 0, ); | ||||
|       _expression.parseExpression.call(void 0, ); | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
|  | ||||
|     case _types.TokenType.jsxTagStart: | ||||
|       jsxParseElement(); | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
|  | ||||
|     case _types.TokenType.string: | ||||
|       nextJSXTagToken(); | ||||
|       return; | ||||
|  | ||||
|     default: | ||||
|       _util.unexpected.call(void 0, "JSX value should be either an expression or a quoted JSX text"); | ||||
|   } | ||||
| } | ||||
|  | ||||
| // Parse JSX spread child, after already processing the { | ||||
| // Does not parse the closing } | ||||
| function jsxParseSpreadChild() { | ||||
|   _util.expect.call(void 0, _types.TokenType.ellipsis); | ||||
|   _expression.parseExpression.call(void 0, ); | ||||
| } | ||||
|  | ||||
| // Parses JSX opening tag starting after "<". | ||||
| // Returns true if the tag was self-closing. | ||||
| // Does not parse the last token. | ||||
| function jsxParseOpeningElement(initialTokenIndex) { | ||||
|   if (_index.match.call(void 0, _types.TokenType.jsxTagEnd)) { | ||||
|     // This is an open-fragment. | ||||
|     return false; | ||||
|   } | ||||
|   jsxParseElementName(); | ||||
|   if (_base.isTypeScriptEnabled) { | ||||
|     _typescript.tsTryParseJSXTypeArgument.call(void 0, ); | ||||
|   } | ||||
|   let hasSeenPropSpread = false; | ||||
|   while (!_index.match.call(void 0, _types.TokenType.slash) && !_index.match.call(void 0, _types.TokenType.jsxTagEnd) && !_base.state.error) { | ||||
|     if (_index.eat.call(void 0, _types.TokenType.braceL)) { | ||||
|       hasSeenPropSpread = true; | ||||
|       _util.expect.call(void 0, _types.TokenType.ellipsis); | ||||
|       _expression.parseMaybeAssign.call(void 0, ); | ||||
|       // } | ||||
|       nextJSXTagToken(); | ||||
|       continue; | ||||
|     } | ||||
|     if ( | ||||
|       hasSeenPropSpread && | ||||
|       _base.state.end - _base.state.start === 3 && | ||||
|       _base.input.charCodeAt(_base.state.start) === _charcodes.charCodes.lowercaseK && | ||||
|       _base.input.charCodeAt(_base.state.start + 1) === _charcodes.charCodes.lowercaseE && | ||||
|       _base.input.charCodeAt(_base.state.start + 2) === _charcodes.charCodes.lowercaseY | ||||
|     ) { | ||||
|       _base.state.tokens[initialTokenIndex].jsxRole = _index.JSXRole.KeyAfterPropSpread; | ||||
|     } | ||||
|     jsxParseNamespacedName(_index.IdentifierRole.ObjectKey); | ||||
|     if (_index.match.call(void 0, _types.TokenType.eq)) { | ||||
|       nextJSXTagToken(); | ||||
|       jsxParseAttributeValue(); | ||||
|     } | ||||
|   } | ||||
|   const isSelfClosing = _index.match.call(void 0, _types.TokenType.slash); | ||||
|   if (isSelfClosing) { | ||||
|     // / | ||||
|     nextJSXTagToken(); | ||||
|   } | ||||
|   return isSelfClosing; | ||||
| } | ||||
|  | ||||
| // Parses JSX closing tag starting after "</". | ||||
| // Does not parse the last token. | ||||
| function jsxParseClosingElement() { | ||||
|   if (_index.match.call(void 0, _types.TokenType.jsxTagEnd)) { | ||||
|     // Fragment syntax, so we immediately have a tag end. | ||||
|     return; | ||||
|   } | ||||
|   jsxParseElementName(); | ||||
| } | ||||
|  | ||||
| // Parses entire JSX element, including its opening tag | ||||
| // (starting after "<"), attributes, contents and closing tag. | ||||
| // Does not parse the last token. | ||||
| function jsxParseElementAt() { | ||||
|   const initialTokenIndex = _base.state.tokens.length - 1; | ||||
|   _base.state.tokens[initialTokenIndex].jsxRole = _index.JSXRole.NoChildren; | ||||
|   let numExplicitChildren = 0; | ||||
|   const isSelfClosing = jsxParseOpeningElement(initialTokenIndex); | ||||
|   if (!isSelfClosing) { | ||||
|     nextJSXExprToken(); | ||||
|     while (true) { | ||||
|       switch (_base.state.type) { | ||||
|         case _types.TokenType.jsxTagStart: | ||||
|           nextJSXTagToken(); | ||||
|           if (_index.match.call(void 0, _types.TokenType.slash)) { | ||||
|             nextJSXTagToken(); | ||||
|             jsxParseClosingElement(); | ||||
|             // Key after prop spread takes precedence over number of children, | ||||
|             // since it means we switch to createElement, which doesn't care | ||||
|             // about number of children. | ||||
|             if (_base.state.tokens[initialTokenIndex].jsxRole !== _index.JSXRole.KeyAfterPropSpread) { | ||||
|               if (numExplicitChildren === 1) { | ||||
|                 _base.state.tokens[initialTokenIndex].jsxRole = _index.JSXRole.OneChild; | ||||
|               } else if (numExplicitChildren > 1) { | ||||
|                 _base.state.tokens[initialTokenIndex].jsxRole = _index.JSXRole.StaticChildren; | ||||
|               } | ||||
|             } | ||||
|             return; | ||||
|           } | ||||
|           numExplicitChildren++; | ||||
|           jsxParseElementAt(); | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
|  | ||||
|         case _types.TokenType.jsxText: | ||||
|           numExplicitChildren++; | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
|  | ||||
|         case _types.TokenType.jsxEmptyText: | ||||
|           nextJSXExprToken(); | ||||
|           break; | ||||
|  | ||||
|         case _types.TokenType.braceL: | ||||
|           _index.next.call(void 0, ); | ||||
|           if (_index.match.call(void 0, _types.TokenType.ellipsis)) { | ||||
|             jsxParseSpreadChild(); | ||||
|             nextJSXExprToken(); | ||||
|             // Spread children are a mechanism to explicitly mark children as | ||||
|             // static, so count it as 2 children to satisfy the "more than one | ||||
|             // child" condition. | ||||
|             numExplicitChildren += 2; | ||||
|           } else { | ||||
|             // If we see {}, this is an empty pseudo-expression that doesn't | ||||
|             // count as a child. | ||||
|             if (!_index.match.call(void 0, _types.TokenType.braceR)) { | ||||
|               numExplicitChildren++; | ||||
|               _expression.parseExpression.call(void 0, ); | ||||
|             } | ||||
|             nextJSXExprToken(); | ||||
|           } | ||||
|  | ||||
|           break; | ||||
|  | ||||
|         // istanbul ignore next - should never happen | ||||
|         default: | ||||
|           _util.unexpected.call(void 0, ); | ||||
|           return; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| } | ||||
|  | ||||
| // Parses entire JSX element from current position. | ||||
| // Does not parse the last token. | ||||
|  function jsxParseElement() { | ||||
|   nextJSXTagToken(); | ||||
|   jsxParseElementAt(); | ||||
| } exports.jsxParseElement = jsxParseElement; | ||||
|  | ||||
| // ================================== | ||||
| // Overrides | ||||
| // ================================== | ||||
|  | ||||
|  function nextJSXTagToken() { | ||||
|   _base.state.tokens.push(new (0, _index.Token)()); | ||||
|   _index.skipSpace.call(void 0, ); | ||||
|   _base.state.start = _base.state.pos; | ||||
|   const code = _base.input.charCodeAt(_base.state.pos); | ||||
|  | ||||
|   if (_identifier.IS_IDENTIFIER_START[code]) { | ||||
|     jsxReadWord(); | ||||
|   } else if (code === _charcodes.charCodes.quotationMark || code === _charcodes.charCodes.apostrophe) { | ||||
|     jsxReadString(code); | ||||
|   } else { | ||||
|     // The following tokens are just one character each. | ||||
|     ++_base.state.pos; | ||||
|     switch (code) { | ||||
|       case _charcodes.charCodes.greaterThan: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.jsxTagEnd); | ||||
|         break; | ||||
|       case _charcodes.charCodes.lessThan: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.jsxTagStart); | ||||
|         break; | ||||
|       case _charcodes.charCodes.slash: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.slash); | ||||
|         break; | ||||
|       case _charcodes.charCodes.equalsTo: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.eq); | ||||
|         break; | ||||
|       case _charcodes.charCodes.leftCurlyBrace: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.braceL); | ||||
|         break; | ||||
|       case _charcodes.charCodes.dot: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.dot); | ||||
|         break; | ||||
|       case _charcodes.charCodes.colon: | ||||
|         _index.finishToken.call(void 0, _types.TokenType.colon); | ||||
|         break; | ||||
|       default: | ||||
|         _util.unexpected.call(void 0, ); | ||||
|     } | ||||
|   } | ||||
| } exports.nextJSXTagToken = nextJSXTagToken; | ||||
|  | ||||
| function nextJSXExprToken() { | ||||
|   _base.state.tokens.push(new (0, _index.Token)()); | ||||
|   _base.state.start = _base.state.pos; | ||||
|   jsxReadToken(); | ||||
| } | ||||
256	node_modules/sucrase/dist/parser/plugins/jsx/xhtml.js	generated vendored Normal file
							|  | @ -0,0 +1,256 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});// Use a Map rather than object to avoid unexpected __proto__ access. | ||||
| exports. default = new Map([ | ||||
|   ["quot", "\u0022"], | ||||
|   ["amp", "&"], | ||||
|   ["apos", "\u0027"], | ||||
|   ["lt", "<"], | ||||
|   ["gt", ">"], | ||||
|   ["nbsp", "\u00A0"], | ||||
|   ["iexcl", "\u00A1"], | ||||
|   ["cent", "\u00A2"], | ||||
|   ["pound", "\u00A3"], | ||||
|   ["curren", "\u00A4"], | ||||
|   ["yen", "\u00A5"], | ||||
|   ["brvbar", "\u00A6"], | ||||
|   ["sect", "\u00A7"], | ||||
|   ["uml", "\u00A8"], | ||||
|   ["copy", "\u00A9"], | ||||
|   ["ordf", "\u00AA"], | ||||
|   ["laquo", "\u00AB"], | ||||
|   ["not", "\u00AC"], | ||||
|   ["shy", "\u00AD"], | ||||
|   ["reg", "\u00AE"], | ||||
|   ["macr", "\u00AF"], | ||||
|   ["deg", "\u00B0"], | ||||
|   ["plusmn", "\u00B1"], | ||||
|   ["sup2", "\u00B2"], | ||||
|   ["sup3", "\u00B3"], | ||||
|   ["acute", "\u00B4"], | ||||
|   ["micro", "\u00B5"], | ||||
|   ["para", "\u00B6"], | ||||
|   ["middot", "\u00B7"], | ||||
|   ["cedil", "\u00B8"], | ||||
|   ["sup1", "\u00B9"], | ||||
|   ["ordm", "\u00BA"], | ||||
|   ["raquo", "\u00BB"], | ||||
|   ["frac14", "\u00BC"], | ||||
|   ["frac12", "\u00BD"], | ||||
|   ["frac34", "\u00BE"], | ||||
|   ["iquest", "\u00BF"], | ||||
|   ["Agrave", "\u00C0"], | ||||
|   ["Aacute", "\u00C1"], | ||||
|   ["Acirc", "\u00C2"], | ||||
|   ["Atilde", "\u00C3"], | ||||
|   ["Auml", "\u00C4"], | ||||
|   ["Aring", "\u00C5"], | ||||
|   ["AElig", "\u00C6"], | ||||
|   ["Ccedil", "\u00C7"], | ||||
|   ["Egrave", "\u00C8"], | ||||
|   ["Eacute", "\u00C9"], | ||||
|   ["Ecirc", "\u00CA"], | ||||
|   ["Euml", "\u00CB"], | ||||
|   ["Igrave", "\u00CC"], | ||||
|   ["Iacute", "\u00CD"], | ||||
|   ["Icirc", "\u00CE"], | ||||
|   ["Iuml", "\u00CF"], | ||||
|   ["ETH", "\u00D0"], | ||||
|   ["Ntilde", "\u00D1"], | ||||
|   ["Ograve", "\u00D2"], | ||||
|   ["Oacute", "\u00D3"], | ||||
|   ["Ocirc", "\u00D4"], | ||||
|   ["Otilde", "\u00D5"], | ||||
|   ["Ouml", "\u00D6"], | ||||
|   ["times", "\u00D7"], | ||||
|   ["Oslash", "\u00D8"], | ||||
|   ["Ugrave", "\u00D9"], | ||||
|   ["Uacute", "\u00DA"], | ||||
|   ["Ucirc", "\u00DB"], | ||||
|   ["Uuml", "\u00DC"], | ||||
|   ["Yacute", "\u00DD"], | ||||
|   ["THORN", "\u00DE"], | ||||
|   ["szlig", "\u00DF"], | ||||
|   ["agrave", "\u00E0"], | ||||
|   ["aacute", "\u00E1"], | ||||
|   ["acirc", "\u00E2"], | ||||
|   ["atilde", "\u00E3"], | ||||
|   ["auml", "\u00E4"], | ||||
|   ["aring", "\u00E5"], | ||||
|   ["aelig", "\u00E6"], | ||||
|   ["ccedil", "\u00E7"], | ||||
|   ["egrave", "\u00E8"], | ||||
|   ["eacute", "\u00E9"], | ||||
|   ["ecirc", "\u00EA"], | ||||
|   ["euml", "\u00EB"], | ||||
|   ["igrave", "\u00EC"], | ||||
|   ["iacute", "\u00ED"], | ||||
|   ["icirc", "\u00EE"], | ||||
|   ["iuml", "\u00EF"], | ||||
|   ["eth", "\u00F0"], | ||||
|   ["ntilde", "\u00F1"], | ||||
|   ["ograve", "\u00F2"], | ||||
|   ["oacute", "\u00F3"], | ||||
|   ["ocirc", "\u00F4"], | ||||
|   ["otilde", "\u00F5"], | ||||
|   ["ouml", "\u00F6"], | ||||
|   ["divide", "\u00F7"], | ||||
|   ["oslash", "\u00F8"], | ||||
|   ["ugrave", "\u00F9"], | ||||
|   ["uacute", "\u00FA"], | ||||
|   ["ucirc", "\u00FB"], | ||||
|   ["uuml", "\u00FC"], | ||||
|   ["yacute", "\u00FD"], | ||||
|   ["thorn", "\u00FE"], | ||||
|   ["yuml", "\u00FF"], | ||||
|   ["OElig", "\u0152"], | ||||
|   ["oelig", "\u0153"], | ||||
|   ["Scaron", "\u0160"], | ||||
|   ["scaron", "\u0161"], | ||||
|   ["Yuml", "\u0178"], | ||||
|   ["fnof", "\u0192"], | ||||
|   ["circ", "\u02C6"], | ||||
|   ["tilde", "\u02DC"], | ||||
|   ["Alpha", "\u0391"], | ||||
|   ["Beta", "\u0392"], | ||||
|   ["Gamma", "\u0393"], | ||||
|   ["Delta", "\u0394"], | ||||
|   ["Epsilon", "\u0395"], | ||||
|   ["Zeta", "\u0396"], | ||||
|   ["Eta", "\u0397"], | ||||
|   ["Theta", "\u0398"], | ||||
|   ["Iota", "\u0399"], | ||||
|   ["Kappa", "\u039A"], | ||||
|   ["Lambda", "\u039B"], | ||||
|   ["Mu", "\u039C"], | ||||
|   ["Nu", "\u039D"], | ||||
|   ["Xi", "\u039E"], | ||||
|   ["Omicron", "\u039F"], | ||||
|   ["Pi", "\u03A0"], | ||||
|   ["Rho", "\u03A1"], | ||||
|   ["Sigma", "\u03A3"], | ||||
|   ["Tau", "\u03A4"], | ||||
|   ["Upsilon", "\u03A5"], | ||||
|   ["Phi", "\u03A6"], | ||||
|   ["Chi", "\u03A7"], | ||||
|   ["Psi", "\u03A8"], | ||||
|   ["Omega", "\u03A9"], | ||||
|   ["alpha", "\u03B1"], | ||||
|   ["beta", "\u03B2"], | ||||
|   ["gamma", "\u03B3"], | ||||
|   ["delta", "\u03B4"], | ||||
|   ["epsilon", "\u03B5"], | ||||
|   ["zeta", "\u03B6"], | ||||
|   ["eta", "\u03B7"], | ||||
|   ["theta", "\u03B8"], | ||||
|   ["iota", "\u03B9"], | ||||
|   ["kappa", "\u03BA"], | ||||
|   ["lambda", "\u03BB"], | ||||
|   ["mu", "\u03BC"], | ||||
|   ["nu", "\u03BD"], | ||||
|   ["xi", "\u03BE"], | ||||
|   ["omicron", "\u03BF"], | ||||
|   ["pi", "\u03C0"], | ||||
|   ["rho", "\u03C1"], | ||||
|   ["sigmaf", "\u03C2"], | ||||
|   ["sigma", "\u03C3"], | ||||
|   ["tau", "\u03C4"], | ||||
|   ["upsilon", "\u03C5"], | ||||
|   ["phi", "\u03C6"], | ||||
|   ["chi", "\u03C7"], | ||||
|   ["psi", "\u03C8"], | ||||
|   ["omega", "\u03C9"], | ||||
|   ["thetasym", "\u03D1"], | ||||
|   ["upsih", "\u03D2"], | ||||
|   ["piv", "\u03D6"], | ||||
|   ["ensp", "\u2002"], | ||||
|   ["emsp", "\u2003"], | ||||
|   ["thinsp", "\u2009"], | ||||
|   ["zwnj", "\u200C"], | ||||
|   ["zwj", "\u200D"], | ||||
|   ["lrm", "\u200E"], | ||||
|   ["rlm", "\u200F"], | ||||
|   ["ndash", "\u2013"], | ||||
|   ["mdash", "\u2014"], | ||||
|   ["lsquo", "\u2018"], | ||||
|   ["rsquo", "\u2019"], | ||||
|   ["sbquo", "\u201A"], | ||||
|   ["ldquo", "\u201C"], | ||||
|   ["rdquo", "\u201D"], | ||||
|   ["bdquo", "\u201E"], | ||||
|   ["dagger", "\u2020"], | ||||
|   ["Dagger", "\u2021"], | ||||
|   ["bull", "\u2022"], | ||||
|   ["hellip", "\u2026"], | ||||
|   ["permil", "\u2030"], | ||||
|   ["prime", "\u2032"], | ||||
|   ["Prime", "\u2033"], | ||||
|   ["lsaquo", "\u2039"], | ||||
|   ["rsaquo", "\u203A"], | ||||
|   ["oline", "\u203E"], | ||||
|   ["frasl", "\u2044"], | ||||
|   ["euro", "\u20AC"], | ||||
|   ["image", "\u2111"], | ||||
|   ["weierp", "\u2118"], | ||||
|   ["real", "\u211C"], | ||||
|   ["trade", "\u2122"], | ||||
|   ["alefsym", "\u2135"], | ||||
|   ["larr", "\u2190"], | ||||
|   ["uarr", "\u2191"], | ||||
|   ["rarr", "\u2192"], | ||||
|   ["darr", "\u2193"], | ||||
|   ["harr", "\u2194"], | ||||
|   ["crarr", "\u21B5"], | ||||
|   ["lArr", "\u21D0"], | ||||
|   ["uArr", "\u21D1"], | ||||
|   ["rArr", "\u21D2"], | ||||
|   ["dArr", "\u21D3"], | ||||
|   ["hArr", "\u21D4"], | ||||
|   ["forall", "\u2200"], | ||||
|   ["part", "\u2202"], | ||||
|   ["exist", "\u2203"], | ||||
|   ["empty", "\u2205"], | ||||
|   ["nabla", "\u2207"], | ||||
|   ["isin", "\u2208"], | ||||
|   ["notin", "\u2209"], | ||||
|   ["ni", "\u220B"], | ||||
|   ["prod", "\u220F"], | ||||
|   ["sum", "\u2211"], | ||||
|   ["minus", "\u2212"], | ||||
|   ["lowast", "\u2217"], | ||||
|   ["radic", "\u221A"], | ||||
|   ["prop", "\u221D"], | ||||
|   ["infin", "\u221E"], | ||||
|   ["ang", "\u2220"], | ||||
|   ["and", "\u2227"], | ||||
|   ["or", "\u2228"], | ||||
|   ["cap", "\u2229"], | ||||
|   ["cup", "\u222A"], | ||||
|   ["int", "\u222B"], | ||||
|   ["there4", "\u2234"], | ||||
|   ["sim", "\u223C"], | ||||
|   ["cong", "\u2245"], | ||||
|   ["asymp", "\u2248"], | ||||
|   ["ne", "\u2260"], | ||||
|   ["equiv", "\u2261"], | ||||
|   ["le", "\u2264"], | ||||
|   ["ge", "\u2265"], | ||||
|   ["sub", "\u2282"], | ||||
|   ["sup", "\u2283"], | ||||
|   ["nsub", "\u2284"], | ||||
|   ["sube", "\u2286"], | ||||
|   ["supe", "\u2287"], | ||||
|   ["oplus", "\u2295"], | ||||
|   ["otimes", "\u2297"], | ||||
|   ["perp", "\u22A5"], | ||||
|   ["sdot", "\u22C5"], | ||||
|   ["lceil", "\u2308"], | ||||
|   ["rceil", "\u2309"], | ||||
|   ["lfloor", "\u230A"], | ||||
|   ["rfloor", "\u230B"], | ||||
|   ["lang", "\u2329"], | ||||
|   ["rang", "\u232A"], | ||||
|   ["loz", "\u25CA"], | ||||
|   ["spades", "\u2660"], | ||||
|   ["clubs", "\u2663"], | ||||
|   ["hearts", "\u2665"], | ||||
|   ["diams", "\u2666"], | ||||
| ]); | ||||
37	node_modules/sucrase/dist/parser/plugins/types.js	generated vendored Normal file
							|  | @ -0,0 +1,37 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _index = require('../tokenizer/index'); | ||||
| var _types = require('../tokenizer/types'); | ||||
| var _base = require('../traverser/base'); | ||||
| var _expression = require('../traverser/expression'); | ||||
| var _flow = require('./flow'); | ||||
| var _typescript = require('./typescript'); | ||||
|  | ||||
| /** | ||||
|  * Common parser code for TypeScript and Flow. | ||||
|  */ | ||||
|  | ||||
| // An apparent conditional expression could actually be an optional parameter in an arrow function. | ||||
|  function typedParseConditional(noIn) { | ||||
|   // If we see ?:, this can't possibly be a valid conditional. typedParseParenItem will be called | ||||
|   // later to finish off the arrow parameter. We also need to handle bare ? tokens for optional | ||||
|   // parameters without type annotations, i.e. ?, and ?) . | ||||
|   if (_index.match.call(void 0, _types.TokenType.question)) { | ||||
|     const nextType = _index.lookaheadType.call(void 0, ); | ||||
|     if (nextType === _types.TokenType.colon || nextType === _types.TokenType.comma || nextType === _types.TokenType.parenR) { | ||||
|       return; | ||||
|     } | ||||
|   } | ||||
|   _expression.baseParseConditional.call(void 0, noIn); | ||||
| } exports.typedParseConditional = typedParseConditional; | ||||
|  | ||||
| // Note: These "type casts" are *not* valid TS expressions. | ||||
| // But we parse them here and change them when completing the arrow function. | ||||
|  function typedParseParenItem() { | ||||
|   _index.eatTypeToken.call(void 0, _types.TokenType.question); | ||||
|   if (_index.match.call(void 0, _types.TokenType.colon)) { | ||||
|     if (_base.isTypeScriptEnabled) { | ||||
|       _typescript.tsParseTypeAnnotation.call(void 0, ); | ||||
|     } else if (_base.isFlowEnabled) { | ||||
|       _flow.flowParseTypeAnnotation.call(void 0, ); | ||||
|     } | ||||
|   } | ||||
| } exports.typedParseParenItem = typedParseParenItem; | ||||
1632	node_modules/sucrase/dist/parser/plugins/typescript.js	generated vendored Normal file
	File diff suppressed because it is too large
1004	node_modules/sucrase/dist/parser/tokenizer/index.js	generated vendored Normal file
	File diff suppressed because it is too large
43	node_modules/sucrase/dist/parser/tokenizer/keywords.js	generated vendored Normal file
							|  | @ -0,0 +1,43 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var ContextualKeyword; (function (ContextualKeyword) { | ||||
|   const NONE = 0; ContextualKeyword[ContextualKeyword["NONE"] = NONE] = "NONE"; | ||||
|   const _abstract = NONE + 1; ContextualKeyword[ContextualKeyword["_abstract"] = _abstract] = "_abstract"; | ||||
|   const _accessor = _abstract + 1; ContextualKeyword[ContextualKeyword["_accessor"] = _accessor] = "_accessor"; | ||||
|   const _as = _accessor + 1; ContextualKeyword[ContextualKeyword["_as"] = _as] = "_as"; | ||||
|   const _assert = _as + 1; ContextualKeyword[ContextualKeyword["_assert"] = _assert] = "_assert"; | ||||
|   const _asserts = _assert + 1; ContextualKeyword[ContextualKeyword["_asserts"] = _asserts] = "_asserts"; | ||||
|   const _async = _asserts + 1; ContextualKeyword[ContextualKeyword["_async"] = _async] = "_async"; | ||||
|   const _await = _async + 1; ContextualKeyword[ContextualKeyword["_await"] = _await] = "_await"; | ||||
|   const _checks = _await + 1; ContextualKeyword[ContextualKeyword["_checks"] = _checks] = "_checks"; | ||||
|   const _constructor = _checks + 1; ContextualKeyword[ContextualKeyword["_constructor"] = _constructor] = "_constructor"; | ||||
|   const _declare = _constructor + 1; ContextualKeyword[ContextualKeyword["_declare"] = _declare] = "_declare"; | ||||
|   const _enum = _declare + 1; ContextualKeyword[ContextualKeyword["_enum"] = _enum] = "_enum"; | ||||
|   const _exports = _enum + 1; ContextualKeyword[ContextualKeyword["_exports"] = _exports] = "_exports"; | ||||
|   const _from = _exports + 1; ContextualKeyword[ContextualKeyword["_from"] = _from] = "_from"; | ||||
|   const _get = _from + 1; ContextualKeyword[ContextualKeyword["_get"] = _get] = "_get"; | ||||
|   const _global = _get + 1; ContextualKeyword[ContextualKeyword["_global"] = _global] = "_global"; | ||||
|   const _implements = _global + 1; ContextualKeyword[ContextualKeyword["_implements"] = _implements] = "_implements"; | ||||
|   const _infer = _implements + 1; ContextualKeyword[ContextualKeyword["_infer"] = _infer] = "_infer"; | ||||
|   const _interface = _infer + 1; ContextualKeyword[ContextualKeyword["_interface"] = _interface] = "_interface"; | ||||
|   const _is = _interface + 1; ContextualKeyword[ContextualKeyword["_is"] = _is] = "_is"; | ||||
|   const _keyof = _is + 1; ContextualKeyword[ContextualKeyword["_keyof"] = _keyof] = "_keyof"; | ||||
|   const _mixins = _keyof + 1; ContextualKeyword[ContextualKeyword["_mixins"] = _mixins] = "_mixins"; | ||||
|   const _module = _mixins + 1; ContextualKeyword[ContextualKeyword["_module"] = _module] = "_module"; | ||||
|   const _namespace = _module + 1; ContextualKeyword[ContextualKeyword["_namespace"] = _namespace] = "_namespace"; | ||||
|   const _of = _namespace + 1; ContextualKeyword[ContextualKeyword["_of"] = _of] = "_of"; | ||||
|   const _opaque = _of + 1; ContextualKeyword[ContextualKeyword["_opaque"] = _opaque] = "_opaque"; | ||||
|   const _out = _opaque + 1; ContextualKeyword[ContextualKeyword["_out"] = _out] = "_out"; | ||||
|   const _override = _out + 1; ContextualKeyword[ContextualKeyword["_override"] = _override] = "_override"; | ||||
|   const _private = _override + 1; ContextualKeyword[ContextualKeyword["_private"] = _private] = "_private"; | ||||
|   const _protected = _private + 1; ContextualKeyword[ContextualKeyword["_protected"] = _protected] = "_protected"; | ||||
|   const _proto = _protected + 1; ContextualKeyword[ContextualKeyword["_proto"] = _proto] = "_proto"; | ||||
|   const _public = _proto + 1; ContextualKeyword[ContextualKeyword["_public"] = _public] = "_public"; | ||||
|   const _readonly = _public + 1; ContextualKeyword[ContextualKeyword["_readonly"] = _readonly] = "_readonly"; | ||||
|   const _require = _readonly + 1; ContextualKeyword[ContextualKeyword["_require"] = _require] = "_require"; | ||||
|   const _satisfies = _require + 1; ContextualKeyword[ContextualKeyword["_satisfies"] = _satisfies] = "_satisfies"; | ||||
|   const _set = _satisfies + 1; ContextualKeyword[ContextualKeyword["_set"] = _set] = "_set"; | ||||
|   const _static = _set + 1; ContextualKeyword[ContextualKeyword["_static"] = _static] = "_static"; | ||||
|   const _symbol = _static + 1; ContextualKeyword[ContextualKeyword["_symbol"] = _symbol] = "_symbol"; | ||||
|   const _type = _symbol + 1; ContextualKeyword[ContextualKeyword["_type"] = _type] = "_type"; | ||||
|   const _unique = _type + 1; ContextualKeyword[ContextualKeyword["_unique"] = _unique] = "_unique"; | ||||
|   const _using = _unique + 1; ContextualKeyword[ContextualKeyword["_using"] = _using] = "_using"; | ||||
| })(ContextualKeyword || (exports.ContextualKeyword = ContextualKeyword = {})); | ||||
64	node_modules/sucrase/dist/parser/tokenizer/readWord.js	generated vendored Normal file
							|  | @ -0,0 +1,64 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _base = require('../traverser/base'); | ||||
| var _charcodes = require('../util/charcodes'); | ||||
| var _identifier = require('../util/identifier'); | ||||
| var _index = require('./index'); | ||||
| var _readWordTree = require('./readWordTree'); | ||||
| var _types = require('./types'); | ||||
|  | ||||
| /** | ||||
|  * Read an identifier, producing either a name token or matching on one of the existing keywords. | ||||
|  * For performance, we pre-generate big decision tree that we traverse. Each node represents a | ||||
|  * prefix and has 27 values, where the first value is the token or contextual token, if any (-1 if | ||||
|  * not), and the other 26 values are the transitions to other nodes, or -1 to stop. | ||||
|  */ | ||||
|  function readWord() { | ||||
|   let treePos = 0; | ||||
|   let code = 0; | ||||
|   let pos = _base.state.pos; | ||||
|   while (pos < _base.input.length) { | ||||
|     code = _base.input.charCodeAt(pos); | ||||
|     if (code < _charcodes.charCodes.lowercaseA || code > _charcodes.charCodes.lowercaseZ) { | ||||
|       break; | ||||
|     } | ||||
|     const next = _readWordTree.READ_WORD_TREE[treePos + (code - _charcodes.charCodes.lowercaseA) + 1]; | ||||
|     if (next === -1) { | ||||
|       break; | ||||
|     } else { | ||||
|       treePos = next; | ||||
|       pos++; | ||||
|     } | ||||
|   } | ||||
|  | ||||
|   const keywordValue = _readWordTree.READ_WORD_TREE[treePos]; | ||||
|   if (keywordValue > -1 && !_identifier.IS_IDENTIFIER_CHAR[code]) { | ||||
|     _base.state.pos = pos; | ||||
|     if (keywordValue & 1) { | ||||
|       _index.finishToken.call(void 0, keywordValue >>> 1); | ||||
|     } else { | ||||
|       _index.finishToken.call(void 0, _types.TokenType.name, keywordValue >>> 1); | ||||
|     } | ||||
|     return; | ||||
|   } | ||||
|  | ||||
|   while (pos < _base.input.length) { | ||||
|     const ch = _base.input.charCodeAt(pos); | ||||
|     if (_identifier.IS_IDENTIFIER_CHAR[ch]) { | ||||
|       pos++; | ||||
|     } else if (ch === _charcodes.charCodes.backslash) { | ||||
|       // \u | ||||
|       pos += 2; | ||||
|       if (_base.input.charCodeAt(pos) === _charcodes.charCodes.leftCurlyBrace) { | ||||
|         while (pos < _base.input.length && _base.input.charCodeAt(pos) !== _charcodes.charCodes.rightCurlyBrace) { | ||||
|           pos++; | ||||
|         } | ||||
|         pos++; | ||||
|       } | ||||
|     } else if (ch === _charcodes.charCodes.atSign && _base.input.charCodeAt(pos + 1) === _charcodes.charCodes.atSign) { | ||||
|       pos += 2; | ||||
|     } else { | ||||
|       break; | ||||
|     } | ||||
|   } | ||||
|   _base.state.pos = pos; | ||||
|   _index.finishToken.call(void 0, _types.TokenType.name); | ||||
| } exports.default = readWord; | ||||
671	node_modules/sucrase/dist/parser/tokenizer/readWordTree.js	generated vendored Normal file
							|  | @ -0,0 +1,671 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});// Generated file, do not edit! Run "yarn generate" to re-generate this file. | ||||
| var _keywords = require('./keywords'); | ||||
| var _types = require('./types'); | ||||
| 
 | ||||
| // prettier-ignore
 | ||||
|  const READ_WORD_TREE = new Int32Array([ | ||||
|   // ""
 | ||||
|   -1, 27, 783, 918, 1755, 2376, 2862, 3483, -1, 3699, -1, 4617, 4752, 4833, 5130, 5508, 5940, -1, 6480, 6939, 7749, 8181, 8451, 8613, -1, 8829, -1, | ||||
|   // "a"
 | ||||
|   -1, -1, 54, 243, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 432, -1, -1, -1, 675, -1, -1, -1, | ||||
|   // "ab"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 81, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abs"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 108, -1, -1, -1, -1, -1, -1, | ||||
|   // "abst"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 135, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstr"
 | ||||
|   -1, 162, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstra"
 | ||||
|   -1, -1, -1, 189, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstrac"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 216, -1, -1, -1, -1, -1, -1, | ||||
|   // "abstract"
 | ||||
|   _keywords.ContextualKeyword._abstract << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ac"
 | ||||
|   -1, -1, -1, 270, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acc"
 | ||||
|   -1, -1, -1, -1, -1, 297, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acce"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 324, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "acces"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 351, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "access"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 378, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "accesso"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 405, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "accessor"
 | ||||
|   _keywords.ContextualKeyword._accessor << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "as"
 | ||||
|   _keywords.ContextualKeyword._as << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 459, -1, -1, -1, -1, -1, 594, -1, | ||||
|   // "ass"
 | ||||
|   -1, -1, -1, -1, -1, 486, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asse"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 513, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asser"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 540, -1, -1, -1, -1, -1, -1, | ||||
|   // "assert"
 | ||||
|   _keywords.ContextualKeyword._assert << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 567, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asserts"
 | ||||
|   _keywords.ContextualKeyword._asserts << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asy"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 621, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "asyn"
 | ||||
|   -1, -1, -1, 648, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "async"
 | ||||
|   _keywords.ContextualKeyword._async << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "aw"
 | ||||
|   -1, 702, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "awa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 729, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "awai"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 756, -1, -1, -1, -1, -1, -1, | ||||
|   // "await"
 | ||||
|   _keywords.ContextualKeyword._await << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "b"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 810, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "br"
 | ||||
|   -1, -1, -1, -1, -1, 837, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "bre"
 | ||||
|   -1, 864, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "brea"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 891, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "break"
 | ||||
|   (_types.TokenType._break << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "c"
 | ||||
|   -1, 945, -1, -1, -1, -1, -1, -1, 1107, -1, -1, -1, 1242, -1, -1, 1350, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ca"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 972, 1026, -1, -1, -1, -1, -1, -1, | ||||
|   // "cas"
 | ||||
|   -1, -1, -1, -1, -1, 999, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "case"
 | ||||
|   (_types.TokenType._case << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cat"
 | ||||
|   -1, -1, -1, 1053, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "catc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 1080, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "catch"
 | ||||
|   (_types.TokenType._catch << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ch"
 | ||||
|   -1, -1, -1, -1, -1, 1134, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "che"
 | ||||
|   -1, -1, -1, 1161, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "chec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1188, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "check"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1215, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "checks"
 | ||||
|   _keywords.ContextualKeyword._checks << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cl"
 | ||||
|   -1, 1269, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cla"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1296, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "clas"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1323, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "class"
 | ||||
|   (_types.TokenType._class << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "co"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1377, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "con"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1404, 1620, -1, -1, -1, -1, -1, -1, | ||||
|   // "cons"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1431, -1, -1, -1, -1, -1, -1, | ||||
|   // "const"
 | ||||
|   (_types.TokenType._const << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1458, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1485, -1, -1, -1, -1, -1, | ||||
|   // "constru"
 | ||||
|   -1, -1, -1, 1512, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "construc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1539, -1, -1, -1, -1, -1, -1, | ||||
|   // "construct"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1566, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constructo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1593, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "constructor"
 | ||||
|   _keywords.ContextualKeyword._constructor << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "cont"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 1647, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "conti"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1674, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "contin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1701, -1, -1, -1, -1, -1, | ||||
|   // "continu"
 | ||||
|   -1, -1, -1, -1, -1, 1728, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "continue"
 | ||||
|   (_types.TokenType._continue << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "d"
 | ||||
|   -1, -1, -1, -1, -1, 1782, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2349, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "de"
 | ||||
|   -1, -1, 1809, 1971, -1, -1, 2106, -1, -1, -1, -1, -1, 2241, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "deb"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1836, -1, -1, -1, -1, -1, | ||||
|   // "debu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 1863, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debug"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 1890, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugg"
 | ||||
|   -1, -1, -1, -1, -1, 1917, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugge"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1944, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "debugger"
 | ||||
|   (_types.TokenType._debugger << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "dec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 1998, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "decl"
 | ||||
|   -1, 2025, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "decla"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2052, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "declar"
 | ||||
|   -1, -1, -1, -1, -1, 2079, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "declare"
 | ||||
|   _keywords.ContextualKeyword._declare << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "def"
 | ||||
|   -1, 2133, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "defa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2160, -1, -1, -1, -1, -1, | ||||
|   // "defau"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2187, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "defaul"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2214, -1, -1, -1, -1, -1, -1, | ||||
|   // "default"
 | ||||
|   (_types.TokenType._default << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "del"
 | ||||
|   -1, -1, -1, -1, -1, 2268, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "dele"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2295, -1, -1, -1, -1, -1, -1, | ||||
|   // "delet"
 | ||||
|   -1, -1, -1, -1, -1, 2322, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "delete"
 | ||||
|   (_types.TokenType._delete << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "do"
 | ||||
|   (_types.TokenType._do << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "e"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2403, -1, 2484, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2565, -1, -1, | ||||
|   // "el"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2430, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "els"
 | ||||
|   -1, -1, -1, -1, -1, 2457, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "else"
 | ||||
|   (_types.TokenType._else << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "en"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2511, -1, -1, -1, -1, -1, | ||||
|   // "enu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2538, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "enum"
 | ||||
|   _keywords.ContextualKeyword._enum << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ex"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2592, -1, -1, -1, 2727, -1, -1, -1, -1, -1, -1, | ||||
|   // "exp"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2619, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "expo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2646, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "expor"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2673, -1, -1, -1, -1, -1, -1, | ||||
|   // "export"
 | ||||
|   (_types.TokenType._export << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2700, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exports"
 | ||||
|   _keywords.ContextualKeyword._exports << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ext"
 | ||||
|   -1, -1, -1, -1, -1, 2754, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exte"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2781, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "exten"
 | ||||
|   -1, -1, -1, -1, 2808, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "extend"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2835, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "extends"
 | ||||
|   (_types.TokenType._extends << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "f"
 | ||||
|   -1, 2889, -1, -1, -1, -1, -1, -1, -1, 2997, -1, -1, -1, -1, -1, 3159, -1, -1, 3213, -1, -1, 3294, -1, -1, -1, -1, -1, | ||||
|   // "fa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2916, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fal"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 2943, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fals"
 | ||||
|   -1, -1, -1, -1, -1, 2970, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "false"
 | ||||
|   (_types.TokenType._false << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3024, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fin"
 | ||||
|   -1, 3051, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fina"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3078, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "final"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3105, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "finall"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3132, -1, | ||||
|   // "finally"
 | ||||
|   (_types.TokenType._finally << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3186, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "for"
 | ||||
|   (_types.TokenType._for << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3240, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3267, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "from"
 | ||||
|   _keywords.ContextualKeyword._from << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3321, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "fun"
 | ||||
|   -1, -1, -1, 3348, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "func"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3375, -1, -1, -1, -1, -1, -1, | ||||
|   // "funct"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 3402, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "functi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3429, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "functio"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3456, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "function"
 | ||||
|   (_types.TokenType._function << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "g"
 | ||||
|   -1, -1, -1, -1, -1, 3510, -1, -1, -1, -1, -1, -1, 3564, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ge"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3537, -1, -1, -1, -1, -1, -1, | ||||
|   // "get"
 | ||||
|   _keywords.ContextualKeyword._get << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "gl"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3591, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "glo"
 | ||||
|   -1, -1, 3618, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "glob"
 | ||||
|   -1, 3645, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "globa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3672, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "global"
 | ||||
|   _keywords.ContextualKeyword._global << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "i"
 | ||||
|   -1, -1, -1, -1, -1, -1, 3726, -1, -1, -1, -1, -1, -1, 3753, 4077, -1, -1, -1, -1, 4590, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "if"
 | ||||
|   (_types.TokenType._if << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "im"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3780, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "imp"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3807, -1, -1, 3996, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impl"
 | ||||
|   -1, -1, -1, -1, -1, 3834, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "imple"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3861, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implem"
 | ||||
|   -1, -1, -1, -1, -1, 3888, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impleme"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3915, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implemen"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3942, -1, -1, -1, -1, -1, -1, | ||||
|   // "implement"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 3969, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "implements"
 | ||||
|   _keywords.ContextualKeyword._implements << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4023, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "impor"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4050, -1, -1, -1, -1, -1, -1, | ||||
|   // "import"
 | ||||
|   (_types.TokenType._import << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "in"
 | ||||
|   (_types.TokenType._in << 1) + 1, -1, -1, -1, -1, -1, 4104, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4185, 4401, -1, -1, -1, -1, -1, -1, | ||||
|   // "inf"
 | ||||
|   -1, -1, -1, -1, -1, 4131, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "infe"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4158, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "infer"
 | ||||
|   _keywords.ContextualKeyword._infer << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ins"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4212, -1, -1, -1, -1, -1, -1, | ||||
|   // "inst"
 | ||||
|   -1, 4239, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "insta"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4266, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instan"
 | ||||
|   -1, -1, -1, 4293, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanc"
 | ||||
|   -1, -1, -1, -1, -1, 4320, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instance"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4347, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanceo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4374, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "instanceof"
 | ||||
|   (_types.TokenType._instanceof << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "int"
 | ||||
|   -1, -1, -1, -1, -1, 4428, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "inte"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4455, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "inter"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4482, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interf"
 | ||||
|   -1, 4509, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interfa"
 | ||||
|   -1, -1, -1, 4536, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interfac"
 | ||||
|   -1, -1, -1, -1, -1, 4563, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "interface"
 | ||||
|   _keywords.ContextualKeyword._interface << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "is"
 | ||||
|   _keywords.ContextualKeyword._is << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "k"
 | ||||
|   -1, -1, -1, -1, -1, 4644, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ke"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4671, -1, | ||||
|   // "key"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4698, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "keyo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 4725, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "keyof"
 | ||||
|   _keywords.ContextualKeyword._keyof << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "l"
 | ||||
|   -1, -1, -1, -1, -1, 4779, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "le"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4806, -1, -1, -1, -1, -1, -1, | ||||
|   // "let"
 | ||||
|   (_types.TokenType._let << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "m"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 4860, -1, -1, -1, -1, -1, 4995, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4887, -1, -1, | ||||
|   // "mix"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 4914, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4941, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 4968, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mixins"
 | ||||
|   _keywords.ContextualKeyword._mixins << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mo"
 | ||||
|   -1, -1, -1, -1, 5022, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "mod"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5049, -1, -1, -1, -1, -1, | ||||
|   // "modu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5076, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "modul"
 | ||||
|   -1, -1, -1, -1, -1, 5103, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "module"
 | ||||
|   _keywords.ContextualKeyword._module << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "n"
 | ||||
|   -1, 5157, -1, -1, -1, 5373, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5427, -1, -1, -1, -1, -1, | ||||
|   // "na"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5184, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nam"
 | ||||
|   -1, -1, -1, -1, -1, 5211, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "name"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5238, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "names"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5265, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namesp"
 | ||||
|   -1, 5292, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespa"
 | ||||
|   -1, -1, -1, 5319, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespac"
 | ||||
|   -1, -1, -1, -1, -1, 5346, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "namespace"
 | ||||
|   _keywords.ContextualKeyword._namespace << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ne"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5400, -1, -1, -1, | ||||
|   // "new"
 | ||||
|   (_types.TokenType._new << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5454, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "nul"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5481, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "null"
 | ||||
|   (_types.TokenType._null << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "o"
 | ||||
|   -1, -1, -1, -1, -1, -1, 5535, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5562, -1, -1, -1, -1, 5697, 5751, -1, -1, -1, -1, | ||||
|   // "of"
 | ||||
|   _keywords.ContextualKeyword._of << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "op"
 | ||||
|   -1, 5589, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5616, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opaq"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5643, -1, -1, -1, -1, -1, | ||||
|   // "opaqu"
 | ||||
|   -1, -1, -1, -1, -1, 5670, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "opaque"
 | ||||
|   _keywords.ContextualKeyword._opaque << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ou"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5724, -1, -1, -1, -1, -1, -1, | ||||
|   // "out"
 | ||||
|   _keywords.ContextualKeyword._out << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ov"
 | ||||
|   -1, -1, -1, -1, -1, 5778, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ove"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5805, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "over"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5832, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 5859, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overri"
 | ||||
|   -1, -1, -1, -1, 5886, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "overrid"
 | ||||
|   -1, -1, -1, -1, -1, 5913, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "override"
 | ||||
|   _keywords.ContextualKeyword._override << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "p"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 5967, -1, -1, 6345, -1, -1, -1, -1, -1, | ||||
|   // "pr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 5994, -1, -1, -1, -1, -1, 6129, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pri"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6021, -1, -1, -1, -1, | ||||
|   // "priv"
 | ||||
|   -1, 6048, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "priva"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6075, -1, -1, -1, -1, -1, -1, | ||||
|   // "privat"
 | ||||
|   -1, -1, -1, -1, -1, 6102, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "private"
 | ||||
|   _keywords.ContextualKeyword._private << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6156, -1, -1, -1, -1, -1, -1, | ||||
|   // "prot"
 | ||||
|   -1, -1, -1, -1, -1, 6183, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6318, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "prote"
 | ||||
|   -1, -1, -1, 6210, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protec"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6237, -1, -1, -1, -1, -1, -1, | ||||
|   // "protect"
 | ||||
|   -1, -1, -1, -1, -1, 6264, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protecte"
 | ||||
|   -1, -1, -1, -1, 6291, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "protected"
 | ||||
|   _keywords.ContextualKeyword._protected << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "proto"
 | ||||
|   _keywords.ContextualKeyword._proto << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pu"
 | ||||
|   -1, -1, 6372, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "pub"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6399, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "publ"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 6426, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "publi"
 | ||||
|   -1, -1, -1, 6453, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "public"
 | ||||
|   _keywords.ContextualKeyword._public << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "r"
 | ||||
|   -1, -1, -1, -1, -1, 6507, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "re"
 | ||||
|   -1, 6534, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6696, -1, -1, 6831, -1, -1, -1, -1, -1, -1, | ||||
|   // "rea"
 | ||||
|   -1, -1, -1, -1, 6561, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "read"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6588, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "reado"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6615, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "readon"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6642, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "readonl"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6669, -1, | ||||
|   // "readonly"
 | ||||
|   _keywords.ContextualKeyword._readonly << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "req"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6723, -1, -1, -1, -1, -1, | ||||
|   // "requ"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 6750, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "requi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6777, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "requir"
 | ||||
|   -1, -1, -1, -1, -1, 6804, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "require"
 | ||||
|   _keywords.ContextualKeyword._require << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ret"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6858, -1, -1, -1, -1, -1, | ||||
|   // "retu"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6885, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "retur"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6912, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "return"
 | ||||
|   (_types.TokenType._return << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "s"
 | ||||
|   -1, 6966, -1, -1, -1, 7182, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7236, 7371, -1, 7479, -1, 7614, -1, | ||||
|   // "sa"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 6993, -1, -1, -1, -1, -1, -1, | ||||
|   // "sat"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7020, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sati"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7047, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satis"
 | ||||
|   -1, -1, -1, -1, -1, -1, 7074, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisf"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7101, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfi"
 | ||||
|   -1, -1, -1, -1, -1, 7128, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfie"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7155, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "satisfies"
 | ||||
|   _keywords.ContextualKeyword._satisfies << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "se"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7209, -1, -1, -1, -1, -1, -1, | ||||
|   // "set"
 | ||||
|   _keywords.ContextualKeyword._set << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "st"
 | ||||
|   -1, 7263, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sta"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7290, -1, -1, -1, -1, -1, -1, | ||||
|   // "stat"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7317, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "stati"
 | ||||
|   -1, -1, -1, 7344, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "static"
 | ||||
|   _keywords.ContextualKeyword._static << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "su"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7398, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sup"
 | ||||
|   -1, -1, -1, -1, -1, 7425, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "supe"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7452, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "super"
 | ||||
|   (_types.TokenType._super << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sw"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7506, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "swi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7533, -1, -1, -1, -1, -1, -1, | ||||
|   // "swit"
 | ||||
|   -1, -1, -1, 7560, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "switc"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 7587, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "switch"
 | ||||
|   (_types.TokenType._switch << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sy"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7641, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "sym"
 | ||||
|   -1, -1, 7668, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symb"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7695, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symbo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7722, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "symbol"
 | ||||
|   _keywords.ContextualKeyword._symbol << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "t"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 7776, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7938, -1, -1, -1, -1, -1, -1, 8046, -1, | ||||
|   // "th"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 7803, -1, -1, -1, -1, -1, -1, -1, -1, 7857, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7830, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "this"
 | ||||
|   (_types.TokenType._this << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7884, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "thro"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7911, -1, -1, -1, | ||||
|   // "throw"
 | ||||
|   (_types.TokenType._throw << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "tr"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 7965, -1, -1, -1, 8019, -1, | ||||
|   // "tru"
 | ||||
|   -1, -1, -1, -1, -1, 7992, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "true"
 | ||||
|   (_types.TokenType._true << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "try"
 | ||||
|   (_types.TokenType._try << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "ty"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8073, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typ"
 | ||||
|   -1, -1, -1, -1, -1, 8100, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "type"
 | ||||
|   _keywords.ContextualKeyword._type << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8127, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typeo"
 | ||||
|   -1, -1, -1, -1, -1, -1, 8154, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "typeof"
 | ||||
|   (_types.TokenType._typeof << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "u"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8208, -1, -1, -1, -1, 8343, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "un"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8235, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "uni"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8262, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "uniq"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8289, -1, -1, -1, -1, -1, | ||||
|   // "uniqu"
 | ||||
|   -1, -1, -1, -1, -1, 8316, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "unique"
 | ||||
|   _keywords.ContextualKeyword._unique << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "us"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8370, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "usi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8397, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "usin"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, 8424, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "using"
 | ||||
|   _keywords.ContextualKeyword._using << 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "v"
 | ||||
|   -1, 8478, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8532, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "va"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8505, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "var"
 | ||||
|   (_types.TokenType._var << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "vo"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8559, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "voi"
 | ||||
|   -1, -1, -1, -1, 8586, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "void"
 | ||||
|   (_types.TokenType._void << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "w"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 8640, 8748, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "wh"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8667, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "whi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8694, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "whil"
 | ||||
|   -1, -1, -1, -1, -1, 8721, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "while"
 | ||||
|   (_types.TokenType._while << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "wi"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8775, -1, -1, -1, -1, -1, -1, | ||||
|   // "wit"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, 8802, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "with"
 | ||||
|   (_types.TokenType._with << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "y"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, 8856, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yi"
 | ||||
|   -1, -1, -1, -1, -1, 8883, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yie"
 | ||||
|   -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, 8910, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yiel"
 | ||||
|   -1, -1, -1, -1, 8937, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
|   // "yield"
 | ||||
|   (_types.TokenType._yield << 1) + 1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, -1, | ||||
| ]); exports.READ_WORD_TREE = READ_WORD_TREE; | ||||
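The table above appears to be a flattened trie: each node occupies 27 slots, where slot 0 holds the match payload and slots 1-26 hold child offsets for the letters "a"-"z" (the offsets visible in the table, e.g. 5400 and 5454, are all multiples of 27). A payload's low bit seems to distinguish real keywords, stored as `(TokenType << 1) + 1`, from contextual keywords, stored as `ContextualKeyword << 1`. A minimal sketch of that lookup walk, using a toy two-node trie rather than the real table:

```javascript
// Toy trie with the same layout as READ_WORD_TREE (an assumption based on
// the offsets above): 27 slots per node, slot 0 = payload, slots 1-26 =
// absolute child offsets for the letters a-z, -1 = no entry.
const NODE = 27;
const tree = new Int32Array(3 * NODE).fill(-1);

// node 0: 'i' -> node 1;  node 1: 'f' -> node 2;  node 2: payload for "if"
tree[0 * NODE + ("i".charCodeAt(0) - 96)] = 1 * NODE;
tree[1 * NODE + ("f".charCodeAt(0) - 96)] = 2 * NODE;
const FAKE_IF_TOKEN = 42;                      // stand-in for TokenType._if
tree[2 * NODE + 0] = (FAKE_IF_TOKEN << 1) + 1; // low bit 1 = real keyword

function lookup(word) {
  let node = 0;
  for (const ch of word) {
    node = tree[node + (ch.charCodeAt(0) - 96)];
    if (node === -1) return null; // no such prefix in the trie
  }
  const payload = tree[node]; // slot 0 of the final node
  if (payload === -1) return null;
  return payload & 1
    ? { kind: "keyword", value: payload >> 1 }
    : { kind: "contextual", value: payload >> 1 };
}
```

With this layout, `lookup("if")` reaches node 2 and decodes the payload back to the stored token value, while prefixes such as `"i"` (no payload) and unknown words return `null`.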
							
								
								
									
										106
									
								
								node_modules/sucrase/dist/parser/tokenizer/state.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
										106

							|  | @ -0,0 +1,106 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| var _keywords = require('./keywords'); | ||||
| var _types = require('./types'); | ||||
| 
 | ||||
|  class Scope { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor(startTokenIndex, endTokenIndex, isFunctionScope) { | ||||
|     this.startTokenIndex = startTokenIndex; | ||||
|     this.endTokenIndex = endTokenIndex; | ||||
|     this.isFunctionScope = isFunctionScope; | ||||
|   } | ||||
| } exports.Scope = Scope; | ||||
| 
 | ||||
|  class StateSnapshot { | ||||
|   constructor( | ||||
|      potentialArrowAt, | ||||
|      noAnonFunctionType, | ||||
|      inDisallowConditionalTypesContext, | ||||
|      tokensLength, | ||||
|      scopesLength, | ||||
|      pos, | ||||
|      type, | ||||
|      contextualKeyword, | ||||
|      start, | ||||
|      end, | ||||
|      isType, | ||||
|      scopeDepth, | ||||
|      error, | ||||
|   ) {;this.potentialArrowAt = potentialArrowAt;this.noAnonFunctionType = noAnonFunctionType;this.inDisallowConditionalTypesContext = inDisallowConditionalTypesContext;this.tokensLength = tokensLength;this.scopesLength = scopesLength;this.pos = pos;this.type = type;this.contextualKeyword = contextualKeyword;this.start = start;this.end = end;this.isType = isType;this.scopeDepth = scopeDepth;this.error = error;} | ||||
| } exports.StateSnapshot = StateSnapshot; | ||||
| 
 | ||||
|  class State {constructor() { State.prototype.__init.call(this);State.prototype.__init2.call(this);State.prototype.__init3.call(this);State.prototype.__init4.call(this);State.prototype.__init5.call(this);State.prototype.__init6.call(this);State.prototype.__init7.call(this);State.prototype.__init8.call(this);State.prototype.__init9.call(this);State.prototype.__init10.call(this);State.prototype.__init11.call(this);State.prototype.__init12.call(this);State.prototype.__init13.call(this); } | ||||
|   // Used to signify the start of a potential arrow function
 | ||||
|   __init() {this.potentialArrowAt = -1} | ||||
| 
 | ||||
|   // Used by Flow to handle an edge case involving function type parsing.
 | ||||
|   __init2() {this.noAnonFunctionType = false} | ||||
| 
 | ||||
|   // Used by TypeScript to handle ambiguities when parsing conditional types.
 | ||||
|   __init3() {this.inDisallowConditionalTypesContext = false} | ||||
| 
 | ||||
|   // Token store.
 | ||||
|   __init4() {this.tokens = []} | ||||
| 
 | ||||
|   // Array of all observed scopes, ordered by their ending position.
 | ||||
|   __init5() {this.scopes = []} | ||||
| 
 | ||||
|   // The current position of the tokenizer in the input.
 | ||||
|   __init6() {this.pos = 0} | ||||
| 
 | ||||
|   // Information about the current token.
 | ||||
|   __init7() {this.type = _types.TokenType.eof} | ||||
|   __init8() {this.contextualKeyword = _keywords.ContextualKeyword.NONE} | ||||
|   __init9() {this.start = 0} | ||||
|   __init10() {this.end = 0} | ||||
| 
 | ||||
|   __init11() {this.isType = false} | ||||
|   __init12() {this.scopeDepth = 0} | ||||
| 
 | ||||
|   /** | ||||
|    * If the parser is in an error state, then the token is always tt.eof and all functions can | ||||
|    * keep executing but should be written so they don't get into an infinite loop in this situation. | ||||
|    * | ||||
|    * This approach, combined with the ability to snapshot and restore state, allows us to implement | ||||
|    * backtracking without exceptions and without needing to explicitly propagate error states | ||||
|    * everywhere. | ||||
|    */ | ||||
|   __init13() {this.error = null} | ||||
| 
 | ||||
|   snapshot() { | ||||
|     return new StateSnapshot( | ||||
|       this.potentialArrowAt, | ||||
|       this.noAnonFunctionType, | ||||
|       this.inDisallowConditionalTypesContext, | ||||
|       this.tokens.length, | ||||
|       this.scopes.length, | ||||
|       this.pos, | ||||
|       this.type, | ||||
|       this.contextualKeyword, | ||||
|       this.start, | ||||
|       this.end, | ||||
|       this.isType, | ||||
|       this.scopeDepth, | ||||
|       this.error, | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|   restoreFromSnapshot(snapshot) { | ||||
|     this.potentialArrowAt = snapshot.potentialArrowAt; | ||||
|     this.noAnonFunctionType = snapshot.noAnonFunctionType; | ||||
|     this.inDisallowConditionalTypesContext = snapshot.inDisallowConditionalTypesContext; | ||||
|     this.tokens.length = snapshot.tokensLength; | ||||
|     this.scopes.length = snapshot.scopesLength; | ||||
|     this.pos = snapshot.pos; | ||||
|     this.type = snapshot.type; | ||||
|     this.contextualKeyword = snapshot.contextualKeyword; | ||||
|     this.start = snapshot.start; | ||||
|     this.end = snapshot.end; | ||||
|     this.isType = snapshot.isType; | ||||
|     this.scopeDepth = snapshot.scopeDepth; | ||||
|     this.error = snapshot.error; | ||||
|   } | ||||
| } exports.default = State; | ||||
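The doc comment above describes how `snapshot()` and `restoreFromSnapshot()` support backtracking without exceptions: a caller records the state, attempts a speculative parse, and rolls back if an error was recorded. A hedged sketch of that pattern, using a hypothetical `MiniState` that mirrors the fields of the `State` class above rather than the real parser:

```javascript
// MiniState mirrors the snapshot-relevant fields of State above.
class MiniState {
  constructor() {
    this.pos = 0;
    this.tokens = [];
    this.error = null;
  }
  snapshot() {
    return { pos: this.pos, tokensLength: this.tokens.length, error: this.error };
  }
  restoreFromSnapshot(s) {
    this.pos = s.pos;
    this.tokens.length = s.tokensLength; // truncate tokens emitted since the snapshot
    this.error = s.error;
  }
}

// Hypothetical helper: run a speculative parse; on error, roll everything back.
function tryParse(state, speculative) {
  const snap = state.snapshot();
  speculative(state);
  if (state.error !== null) {
    state.restoreFromSnapshot(snap); // undo position, tokens, and error state
    return false;
  }
  return true;
}
```

A failed speculative callback (one that sets `state.error`) leaves the state exactly as it was before the attempt, which is what lets the real tokenizer keep executing with `tt.eof` instead of throwing.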
							
								
								
									
										361
									
								
								node_modules/sucrase/dist/parser/tokenizer/types.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
							|  | @ -0,0 +1,361 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});// Generated file, do not edit! Run "yarn generate" to re-generate this file.
 | ||||
| /* istanbul ignore file */ | ||||
| /** | ||||
|  * Enum of all token types, with bit fields to signify meaningful properties. | ||||
|  */ | ||||
| var TokenType; (function (TokenType) { | ||||
|   // Precedence 0 means not an operator; otherwise it is a positive number up to 12.
 | ||||
|   const PRECEDENCE_MASK = 0xf; TokenType[TokenType["PRECEDENCE_MASK"] = PRECEDENCE_MASK] = "PRECEDENCE_MASK"; | ||||
|   const IS_KEYWORD = 1 << 4; TokenType[TokenType["IS_KEYWORD"] = IS_KEYWORD] = "IS_KEYWORD"; | ||||
|   const IS_ASSIGN = 1 << 5; TokenType[TokenType["IS_ASSIGN"] = IS_ASSIGN] = "IS_ASSIGN"; | ||||
|   const IS_RIGHT_ASSOCIATIVE = 1 << 6; TokenType[TokenType["IS_RIGHT_ASSOCIATIVE"] = IS_RIGHT_ASSOCIATIVE] = "IS_RIGHT_ASSOCIATIVE"; | ||||
|   const IS_PREFIX = 1 << 7; TokenType[TokenType["IS_PREFIX"] = IS_PREFIX] = "IS_PREFIX"; | ||||
|   const IS_POSTFIX = 1 << 8; TokenType[TokenType["IS_POSTFIX"] = IS_POSTFIX] = "IS_POSTFIX"; | ||||
|   const IS_EXPRESSION_START = 1 << 9; TokenType[TokenType["IS_EXPRESSION_START"] = IS_EXPRESSION_START] = "IS_EXPRESSION_START"; | ||||
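The masks above pack operator metadata into each token-type value: the low four bits carry binary-operator precedence, and the single-bit flags mark keyword, assignment, associativity, prefix/postfix, and expression-start status. A quick decoding check, with the mask values and the `plus` constant (49802, commented "prec:10 prefix startsExpr") copied from this file:

```javascript
// Mask values as declared above in this generated file.
const PRECEDENCE_MASK = 0xf;
const IS_PREFIX = 1 << 7;
const IS_EXPRESSION_START = 1 << 9;

// plus = 49802 per the table below; decode its packed metadata.
const plus = 49802;
const precedence = plus & PRECEDENCE_MASK;          // 10, matching "prec:10"
const isPrefix = (plus & IS_PREFIX) !== 0;          // true, matching "prefix"
const startsExpr = (plus & IS_EXPRESSION_START) !== 0; // true, matching "startsExpr"
```

This is why, e.g., the parser can compare operator binding strength with a single bitwise AND instead of a lookup table keyed by token name.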
| 
 | ||||
|   const num = 512; TokenType[TokenType["num"] = num] = "num"; // num startsExpr
 | ||||
|   const bigint = 1536; TokenType[TokenType["bigint"] = bigint] = "bigint"; // bigint startsExpr
 | ||||
|   const decimal = 2560; TokenType[TokenType["decimal"] = decimal] = "decimal"; // decimal startsExpr
 | ||||
|   const regexp = 3584; TokenType[TokenType["regexp"] = regexp] = "regexp"; // regexp startsExpr
 | ||||
|   const string = 4608; TokenType[TokenType["string"] = string] = "string"; // string startsExpr
 | ||||
|   const name = 5632; TokenType[TokenType["name"] = name] = "name"; // name startsExpr
 | ||||
|   const eof = 6144; TokenType[TokenType["eof"] = eof] = "eof"; // eof
 | ||||
|   const bracketL = 7680; TokenType[TokenType["bracketL"] = bracketL] = "bracketL"; // [ startsExpr
 | ||||
|   const bracketR = 8192; TokenType[TokenType["bracketR"] = bracketR] = "bracketR"; // ]
 | ||||
|   const braceL = 9728; TokenType[TokenType["braceL"] = braceL] = "braceL"; // { startsExpr
 | ||||
|   const braceBarL = 10752; TokenType[TokenType["braceBarL"] = braceBarL] = "braceBarL"; // {| startsExpr
 | ||||
|   const braceR = 11264; TokenType[TokenType["braceR"] = braceR] = "braceR"; // }
 | ||||
|   const braceBarR = 12288; TokenType[TokenType["braceBarR"] = braceBarR] = "braceBarR"; // |}
 | ||||
|   const parenL = 13824; TokenType[TokenType["parenL"] = parenL] = "parenL"; // ( startsExpr
 | ||||
|   const parenR = 14336; TokenType[TokenType["parenR"] = parenR] = "parenR"; // )
 | ||||
|   const comma = 15360; TokenType[TokenType["comma"] = comma] = "comma"; // ,
 | ||||
|   const semi = 16384; TokenType[TokenType["semi"] = semi] = "semi"; // ;
 | ||||
|   const colon = 17408; TokenType[TokenType["colon"] = colon] = "colon"; // :
 | ||||
|   const doubleColon = 18432; TokenType[TokenType["doubleColon"] = doubleColon] = "doubleColon"; // ::
 | ||||
|   const dot = 19456; TokenType[TokenType["dot"] = dot] = "dot"; // .
 | ||||
|   const question = 20480; TokenType[TokenType["question"] = question] = "question"; // ?
 | ||||
|   const questionDot = 21504; TokenType[TokenType["questionDot"] = questionDot] = "questionDot"; // ?.
 | ||||
|   const arrow = 22528; TokenType[TokenType["arrow"] = arrow] = "arrow"; // =>
 | ||||
|   const template = 23552; TokenType[TokenType["template"] = template] = "template"; // template
 | ||||
|   const ellipsis = 24576; TokenType[TokenType["ellipsis"] = ellipsis] = "ellipsis"; // ...
 | ||||
|   const backQuote = 25600; TokenType[TokenType["backQuote"] = backQuote] = "backQuote"; // `
 | ||||
|   const dollarBraceL = 27136; TokenType[TokenType["dollarBraceL"] = dollarBraceL] = "dollarBraceL"; // ${ startsExpr
 | ||||
|   const at = 27648; TokenType[TokenType["at"] = at] = "at"; // @
 | ||||
|   const hash = 29184; TokenType[TokenType["hash"] = hash] = "hash"; // # startsExpr
 | ||||
|   const eq = 29728; TokenType[TokenType["eq"] = eq] = "eq"; // = isAssign
 | ||||
|   const assign = 30752; TokenType[TokenType["assign"] = assign] = "assign"; // _= isAssign
 | ||||
|   const preIncDec = 32640; TokenType[TokenType["preIncDec"] = preIncDec] = "preIncDec"; // ++/-- prefix postfix startsExpr
 | ||||
|   const postIncDec = 33664; TokenType[TokenType["postIncDec"] = postIncDec] = "postIncDec"; // ++/-- prefix postfix startsExpr
 | ||||
|   const bang = 34432; TokenType[TokenType["bang"] = bang] = "bang"; // ! prefix startsExpr
 | ||||
|   const tilde = 35456; TokenType[TokenType["tilde"] = tilde] = "tilde"; // ~ prefix startsExpr
 | ||||
|   const pipeline = 35841; TokenType[TokenType["pipeline"] = pipeline] = "pipeline"; // |> prec:1
 | ||||
|   const nullishCoalescing = 36866; TokenType[TokenType["nullishCoalescing"] = nullishCoalescing] = "nullishCoalescing"; // ?? prec:2
 | ||||
|   const logicalOR = 37890; TokenType[TokenType["logicalOR"] = logicalOR] = "logicalOR"; // || prec:2
 | ||||
|   const logicalAND = 38915; TokenType[TokenType["logicalAND"] = logicalAND] = "logicalAND"; // && prec:3
 | ||||
|   const bitwiseOR = 39940; TokenType[TokenType["bitwiseOR"] = bitwiseOR] = "bitwiseOR"; // | prec:4
 | ||||
|   const bitwiseXOR = 40965; TokenType[TokenType["bitwiseXOR"] = bitwiseXOR] = "bitwiseXOR"; // ^ prec:5
 | ||||
|   const bitwiseAND = 41990; TokenType[TokenType["bitwiseAND"] = bitwiseAND] = "bitwiseAND"; // & prec:6
 | ||||
|   const equality = 43015; TokenType[TokenType["equality"] = equality] = "equality"; // ==/!= prec:7
 | ||||
|   const lessThan = 44040; TokenType[TokenType["lessThan"] = lessThan] = "lessThan"; // < prec:8
 | ||||
|   const greaterThan = 45064; TokenType[TokenType["greaterThan"] = greaterThan] = "greaterThan"; // > prec:8
 | ||||
|   const relationalOrEqual = 46088; TokenType[TokenType["relationalOrEqual"] = relationalOrEqual] = "relationalOrEqual"; // <=/>= prec:8
 | ||||
|   const bitShiftL = 47113; TokenType[TokenType["bitShiftL"] = bitShiftL] = "bitShiftL"; // << prec:9
 | ||||
|   const bitShiftR = 48137; TokenType[TokenType["bitShiftR"] = bitShiftR] = "bitShiftR"; // >>/>>> prec:9
 | ||||
|   const plus = 49802; TokenType[TokenType["plus"] = plus] = "plus"; // + prec:10 prefix startsExpr
 | ||||
|   const minus = 50826; TokenType[TokenType["minus"] = minus] = "minus"; // - prec:10 prefix startsExpr
 | ||||
|   const modulo = 51723; TokenType[TokenType["modulo"] = modulo] = "modulo"; // % prec:11 startsExpr
 | ||||
|   const star = 52235; TokenType[TokenType["star"] = star] = "star"; // * prec:11
 | ||||
|   const slash = 53259; TokenType[TokenType["slash"] = slash] = "slash"; // / prec:11
 | ||||
|   const exponent = 54348; TokenType[TokenType["exponent"] = exponent] = "exponent"; // ** prec:12 rightAssociative
 | ||||
|   const jsxName = 55296; TokenType[TokenType["jsxName"] = jsxName] = "jsxName"; // jsxName
 | ||||
|   const jsxText = 56320; TokenType[TokenType["jsxText"] = jsxText] = "jsxText"; // jsxText
 | ||||
|   const jsxEmptyText = 57344; TokenType[TokenType["jsxEmptyText"] = jsxEmptyText] = "jsxEmptyText"; // jsxEmptyText
 | ||||
|   const jsxTagStart = 58880; TokenType[TokenType["jsxTagStart"] = jsxTagStart] = "jsxTagStart"; // jsxTagStart startsExpr
 | ||||
|   const jsxTagEnd = 59392; TokenType[TokenType["jsxTagEnd"] = jsxTagEnd] = "jsxTagEnd"; // jsxTagEnd
 | ||||
|   const typeParameterStart = 60928; TokenType[TokenType["typeParameterStart"] = typeParameterStart] = "typeParameterStart"; // typeParameterStart startsExpr
 | ||||
|   const nonNullAssertion = 61440; TokenType[TokenType["nonNullAssertion"] = nonNullAssertion] = "nonNullAssertion"; // nonNullAssertion
 | ||||
|   const _break = 62480; TokenType[TokenType["_break"] = _break] = "_break"; // break keyword
 | ||||
|   const _case = 63504; TokenType[TokenType["_case"] = _case] = "_case"; // case keyword
 | ||||
|   const _catch = 64528; TokenType[TokenType["_catch"] = _catch] = "_catch"; // catch keyword
 | ||||
|   const _continue = 65552; TokenType[TokenType["_continue"] = _continue] = "_continue"; // continue keyword
 | ||||
|   const _debugger = 66576; TokenType[TokenType["_debugger"] = _debugger] = "_debugger"; // debugger keyword
 | ||||
|   const _default = 67600; TokenType[TokenType["_default"] = _default] = "_default"; // default keyword
 | ||||
|   const _do = 68624; TokenType[TokenType["_do"] = _do] = "_do"; // do keyword
 | ||||
|   const _else = 69648; TokenType[TokenType["_else"] = _else] = "_else"; // else keyword
 | ||||
|   const _finally = 70672; TokenType[TokenType["_finally"] = _finally] = "_finally"; // finally keyword
 | ||||
|   const _for = 71696; TokenType[TokenType["_for"] = _for] = "_for"; // for keyword
 | ||||
|   const _function = 73232; TokenType[TokenType["_function"] = _function] = "_function"; // function keyword startsExpr
 | ||||
|   const _if = 73744; TokenType[TokenType["_if"] = _if] = "_if"; // if keyword
 | ||||
|   const _return = 74768; TokenType[TokenType["_return"] = _return] = "_return"; // return keyword
 | ||||
|   const _switch = 75792; TokenType[TokenType["_switch"] = _switch] = "_switch"; // switch keyword
 | ||||
|   const _throw = 77456; TokenType[TokenType["_throw"] = _throw] = "_throw"; // throw keyword prefix startsExpr
 | ||||
|   const _try = 77840; TokenType[TokenType["_try"] = _try] = "_try"; // try keyword
 | ||||
|   const _var = 78864; TokenType[TokenType["_var"] = _var] = "_var"; // var keyword
 | ||||
|   const _let = 79888; TokenType[TokenType["_let"] = _let] = "_let"; // let keyword
 | ||||
|   const _const = 80912; TokenType[TokenType["_const"] = _const] = "_const"; // const keyword
 | ||||
|   const _while = 81936; TokenType[TokenType["_while"] = _while] = "_while"; // while keyword
 | ||||
|   const _with = 82960; TokenType[TokenType["_with"] = _with] = "_with"; // with keyword
 | ||||
|   const _new = 84496; TokenType[TokenType["_new"] = _new] = "_new"; // new keyword startsExpr
 | ||||
|   const _this = 85520; TokenType[TokenType["_this"] = _this] = "_this"; // this keyword startsExpr
 | ||||
|   const _super = 86544; TokenType[TokenType["_super"] = _super] = "_super"; // super keyword startsExpr
 | ||||
|   const _class = 87568; TokenType[TokenType["_class"] = _class] = "_class"; // class keyword startsExpr
 | ||||
|   const _extends = 88080; TokenType[TokenType["_extends"] = _extends] = "_extends"; // extends keyword
 | ||||
|   const _export = 89104; TokenType[TokenType["_export"] = _export] = "_export"; // export keyword
 | ||||
|   const _import = 90640; TokenType[TokenType["_import"] = _import] = "_import"; // import keyword startsExpr
 | ||||
|   const _yield = 91664; TokenType[TokenType["_yield"] = _yield] = "_yield"; // yield keyword startsExpr
 | ||||
|   const _null = 92688; TokenType[TokenType["_null"] = _null] = "_null"; // null keyword startsExpr
 | ||||
|   const _true = 93712; TokenType[TokenType["_true"] = _true] = "_true"; // true keyword startsExpr
 | ||||
|   const _false = 94736; TokenType[TokenType["_false"] = _false] = "_false"; // false keyword startsExpr
 | ||||
|   const _in = 95256; TokenType[TokenType["_in"] = _in] = "_in"; // in prec:8 keyword
 | ||||
|   const _instanceof = 96280; TokenType[TokenType["_instanceof"] = _instanceof] = "_instanceof"; // instanceof prec:8 keyword
 | ||||
|   const _typeof = 97936; TokenType[TokenType["_typeof"] = _typeof] = "_typeof"; // typeof keyword prefix startsExpr
 | ||||
|   const _void = 98960; TokenType[TokenType["_void"] = _void] = "_void"; // void keyword prefix startsExpr
 | ||||
|   const _delete = 99984; TokenType[TokenType["_delete"] = _delete] = "_delete"; // delete keyword prefix startsExpr
 | ||||
|   const _async = 100880; TokenType[TokenType["_async"] = _async] = "_async"; // async keyword startsExpr
 | ||||
|   const _get = 101904; TokenType[TokenType["_get"] = _get] = "_get"; // get keyword startsExpr
 | ||||
|   const _set = 102928; TokenType[TokenType["_set"] = _set] = "_set"; // set keyword startsExpr
 | ||||
|   const _declare = 103952; TokenType[TokenType["_declare"] = _declare] = "_declare"; // declare keyword startsExpr
 | ||||
|   const _readonly = 104976; TokenType[TokenType["_readonly"] = _readonly] = "_readonly"; // readonly keyword startsExpr
 | ||||
|   const _abstract = 106000; TokenType[TokenType["_abstract"] = _abstract] = "_abstract"; // abstract keyword startsExpr
 | ||||
|   const _static = 107024; TokenType[TokenType["_static"] = _static] = "_static"; // static keyword startsExpr
 | ||||
|   const _public = 107536; TokenType[TokenType["_public"] = _public] = "_public"; // public keyword
 | ||||
|   const _private = 108560; TokenType[TokenType["_private"] = _private] = "_private"; // private keyword
 | ||||
|   const _protected = 109584; TokenType[TokenType["_protected"] = _protected] = "_protected"; // protected keyword
 | ||||
|   const _override = 110608; TokenType[TokenType["_override"] = _override] = "_override"; // override keyword
 | ||||
|   const _as = 112144; TokenType[TokenType["_as"] = _as] = "_as"; // as keyword startsExpr
 | ||||
|   const _enum = 113168; TokenType[TokenType["_enum"] = _enum] = "_enum"; // enum keyword startsExpr
 | ||||
|   const _type = 114192; TokenType[TokenType["_type"] = _type] = "_type"; // type keyword startsExpr
 | ||||
|   const _implements = 115216; TokenType[TokenType["_implements"] = _implements] = "_implements"; // implements keyword startsExpr
 | ||||
| })(TokenType || (exports.TokenType = TokenType = {})); | ||||
|  function formatTokenType(tokenType) { | ||||
|   switch (tokenType) { | ||||
|     case TokenType.num: | ||||
|       return "num"; | ||||
|     case TokenType.bigint: | ||||
|       return "bigint"; | ||||
|     case TokenType.decimal: | ||||
|       return "decimal"; | ||||
|     case TokenType.regexp: | ||||
|       return "regexp"; | ||||
|     case TokenType.string: | ||||
|       return "string"; | ||||
|     case TokenType.name: | ||||
|       return "name"; | ||||
|     case TokenType.eof: | ||||
|       return "eof"; | ||||
|     case TokenType.bracketL: | ||||
|       return "["; | ||||
|     case TokenType.bracketR: | ||||
|       return "]"; | ||||
|     case TokenType.braceL: | ||||
|       return "{"; | ||||
|     case TokenType.braceBarL: | ||||
|       return "{|"; | ||||
|     case TokenType.braceR: | ||||
|       return "}"; | ||||
|     case TokenType.braceBarR: | ||||
|       return "|}"; | ||||
|     case TokenType.parenL: | ||||
|       return "("; | ||||
|     case TokenType.parenR: | ||||
|       return ")"; | ||||
|     case TokenType.comma: | ||||
|       return ","; | ||||
|     case TokenType.semi: | ||||
|       return ";"; | ||||
|     case TokenType.colon: | ||||
|       return ":"; | ||||
|     case TokenType.doubleColon: | ||||
|       return "::"; | ||||
|     case TokenType.dot: | ||||
|       return "."; | ||||
|     case TokenType.question: | ||||
|       return "?"; | ||||
|     case TokenType.questionDot: | ||||
|       return "?."; | ||||
|     case TokenType.arrow: | ||||
|       return "=>"; | ||||
|     case TokenType.template: | ||||
|       return "template"; | ||||
|     case TokenType.ellipsis: | ||||
|       return "..."; | ||||
|     case TokenType.backQuote: | ||||
|       return "`"; | ||||
|     case TokenType.dollarBraceL: | ||||
|       return "${"; | ||||
|     case TokenType.at: | ||||
|       return "@"; | ||||
|     case TokenType.hash: | ||||
|       return "#"; | ||||
|     case TokenType.eq: | ||||
|       return "="; | ||||
|     case TokenType.assign: | ||||
|       return "_="; | ||||
|     case TokenType.preIncDec: | ||||
|       return "++/--"; | ||||
|     case TokenType.postIncDec: | ||||
|       return "++/--"; | ||||
|     case TokenType.bang: | ||||
|       return "!"; | ||||
|     case TokenType.tilde: | ||||
|       return "~"; | ||||
|     case TokenType.pipeline: | ||||
|       return "|>"; | ||||
|     case TokenType.nullishCoalescing: | ||||
|       return "??"; | ||||
|     case TokenType.logicalOR: | ||||
|       return "||"; | ||||
|     case TokenType.logicalAND: | ||||
|       return "&&"; | ||||
|     case TokenType.bitwiseOR: | ||||
|       return "|"; | ||||
|     case TokenType.bitwiseXOR: | ||||
|       return "^"; | ||||
|     case TokenType.bitwiseAND: | ||||
|       return "&"; | ||||
|     case TokenType.equality: | ||||
|       return "==/!="; | ||||
|     case TokenType.lessThan: | ||||
|       return "<"; | ||||
|     case TokenType.greaterThan: | ||||
|       return ">"; | ||||
|     case TokenType.relationalOrEqual: | ||||
|       return "<=/>="; | ||||
|     case TokenType.bitShiftL: | ||||
|       return "<<"; | ||||
|     case TokenType.bitShiftR: | ||||
|       return ">>/>>>"; | ||||
|     case TokenType.plus: | ||||
|       return "+"; | ||||
|     case TokenType.minus: | ||||
|       return "-"; | ||||
|     case TokenType.modulo: | ||||
|       return "%"; | ||||
|     case TokenType.star: | ||||
|       return "*"; | ||||
|     case TokenType.slash: | ||||
|       return "/"; | ||||
|     case TokenType.exponent: | ||||
|       return "**"; | ||||
|     case TokenType.jsxName: | ||||
|       return "jsxName"; | ||||
|     case TokenType.jsxText: | ||||
|       return "jsxText"; | ||||
|     case TokenType.jsxEmptyText: | ||||
|       return "jsxEmptyText"; | ||||
|     case TokenType.jsxTagStart: | ||||
|       return "jsxTagStart"; | ||||
|     case TokenType.jsxTagEnd: | ||||
|       return "jsxTagEnd"; | ||||
|     case TokenType.typeParameterStart: | ||||
|       return "typeParameterStart"; | ||||
|     case TokenType.nonNullAssertion: | ||||
|       return "nonNullAssertion"; | ||||
|     case TokenType._break: | ||||
|       return "break"; | ||||
|     case TokenType._case: | ||||
|       return "case"; | ||||
|     case TokenType._catch: | ||||
|       return "catch"; | ||||
|     case TokenType._continue: | ||||
|       return "continue"; | ||||
|     case TokenType._debugger: | ||||
|       return "debugger"; | ||||
|     case TokenType._default: | ||||
|       return "default"; | ||||
|     case TokenType._do: | ||||
|       return "do"; | ||||
|     case TokenType._else: | ||||
|       return "else"; | ||||
|     case TokenType._finally: | ||||
|       return "finally"; | ||||
|     case TokenType._for: | ||||
|       return "for"; | ||||
|     case TokenType._function: | ||||
|       return "function"; | ||||
|     case TokenType._if: | ||||
|       return "if"; | ||||
|     case TokenType._return: | ||||
|       return "return"; | ||||
|     case TokenType._switch: | ||||
|       return "switch"; | ||||
|     case TokenType._throw: | ||||
|       return "throw"; | ||||
|     case TokenType._try: | ||||
|       return "try"; | ||||
|     case TokenType._var: | ||||
|       return "var"; | ||||
|     case TokenType._let: | ||||
|       return "let"; | ||||
|     case TokenType._const: | ||||
|       return "const"; | ||||
|     case TokenType._while: | ||||
|       return "while"; | ||||
|     case TokenType._with: | ||||
|       return "with"; | ||||
|     case TokenType._new: | ||||
|       return "new"; | ||||
|     case TokenType._this: | ||||
|       return "this"; | ||||
|     case TokenType._super: | ||||
|       return "super"; | ||||
|     case TokenType._class: | ||||
|       return "class"; | ||||
|     case TokenType._extends: | ||||
|       return "extends"; | ||||
|     case TokenType._export: | ||||
|       return "export"; | ||||
|     case TokenType._import: | ||||
|       return "import"; | ||||
|     case TokenType._yield: | ||||
|       return "yield"; | ||||
|     case TokenType._null: | ||||
|       return "null"; | ||||
|     case TokenType._true: | ||||
|       return "true"; | ||||
|     case TokenType._false: | ||||
|       return "false"; | ||||
|     case TokenType._in: | ||||
|       return "in"; | ||||
|     case TokenType._instanceof: | ||||
|       return "instanceof"; | ||||
|     case TokenType._typeof: | ||||
|       return "typeof"; | ||||
|     case TokenType._void: | ||||
|       return "void"; | ||||
|     case TokenType._delete: | ||||
|       return "delete"; | ||||
|     case TokenType._async: | ||||
|       return "async"; | ||||
|     case TokenType._get: | ||||
|       return "get"; | ||||
|     case TokenType._set: | ||||
|       return "set"; | ||||
|     case TokenType._declare: | ||||
|       return "declare"; | ||||
|     case TokenType._readonly: | ||||
|       return "readonly"; | ||||
|     case TokenType._abstract: | ||||
|       return "abstract"; | ||||
|     case TokenType._static: | ||||
|       return "static"; | ||||
|     case TokenType._public: | ||||
|       return "public"; | ||||
|     case TokenType._private: | ||||
|       return "private"; | ||||
|     case TokenType._protected: | ||||
|       return "protected"; | ||||
|     case TokenType._override: | ||||
|       return "override"; | ||||
|     case TokenType._as: | ||||
|       return "as"; | ||||
|     case TokenType._enum: | ||||
|       return "enum"; | ||||
|     case TokenType._type: | ||||
|       return "type"; | ||||
|     case TokenType._implements: | ||||
|       return "implements"; | ||||
|     default: | ||||
|       return ""; | ||||
|   } | ||||
| } exports.formatTokenType = formatTokenType; | ||||
							
								
								
									
60	node_modules/sucrase/dist/parser/traverser/base.js	generated	vendored	Normal file
									
								
							|  | @ -0,0 +1,60 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }var _state = require('../tokenizer/state'); var _state2 = _interopRequireDefault(_state); | ||||
| var _charcodes = require('../util/charcodes'); | ||||
| 
 | ||||
|  exports.isJSXEnabled; | ||||
|  exports.isTypeScriptEnabled; | ||||
|  exports.isFlowEnabled; | ||||
|  exports.state; | ||||
|  exports.input; | ||||
|  exports.nextContextId; | ||||
| 
 | ||||
|  function getNextContextId() { | ||||
|   return exports.nextContextId++; | ||||
| } exports.getNextContextId = getNextContextId; | ||||
| 
 | ||||
| // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|  function augmentError(error) { | ||||
|   if ("pos" in error) { | ||||
|     const loc = locationForIndex(error.pos); | ||||
|     error.message += ` (${loc.line}:${loc.column})`; | ||||
|     error.loc = loc; | ||||
|   } | ||||
|   return error; | ||||
| } exports.augmentError = augmentError; | ||||
| 
 | ||||
|  class Loc { | ||||
|    | ||||
|    | ||||
|   constructor(line, column) { | ||||
|     this.line = line; | ||||
|     this.column = column; | ||||
|   } | ||||
| } exports.Loc = Loc; | ||||
| 
 | ||||
|  function locationForIndex(pos) { | ||||
|   let line = 1; | ||||
|   let column = 1; | ||||
|   for (let i = 0; i < pos; i++) { | ||||
|     if (exports.input.charCodeAt(i) === _charcodes.charCodes.lineFeed) { | ||||
|       line++; | ||||
|       column = 1; | ||||
|     } else { | ||||
|       column++; | ||||
|     } | ||||
|   } | ||||
|   return new Loc(line, column); | ||||
| } exports.locationForIndex = locationForIndex; | ||||
| 
 | ||||
|  function initParser( | ||||
|   inputCode, | ||||
|   isJSXEnabledArg, | ||||
|   isTypeScriptEnabledArg, | ||||
|   isFlowEnabledArg, | ||||
| ) { | ||||
|   exports.input = inputCode; | ||||
|   exports.state = new (0, _state2.default)(); | ||||
|   exports.nextContextId = 1; | ||||
|   exports.isJSXEnabled = isJSXEnabledArg; | ||||
|   exports.isTypeScriptEnabled = isTypeScriptEnabledArg; | ||||
|   exports.isFlowEnabled = isFlowEnabledArg; | ||||
| } exports.initParser = initParser; | ||||
							
								
								
									
1022	node_modules/sucrase/dist/parser/traverser/expression.js	generated	vendored	Normal file	(file diff suppressed because it is too large)
18	node_modules/sucrase/dist/parser/traverser/index.js	generated	vendored	Normal file
									
								
							|  | @ -0,0 +1,18 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); | ||||
| var _index = require('../tokenizer/index'); | ||||
| var _charcodes = require('../util/charcodes'); | ||||
| var _base = require('./base'); | ||||
| var _statement = require('./statement'); | ||||
| 
 | ||||
|  function parseFile() { | ||||
|   // If enabled, skip leading hashbang line.
 | ||||
|   if ( | ||||
|     _base.state.pos === 0 && | ||||
|     _base.input.charCodeAt(0) === _charcodes.charCodes.numberSign && | ||||
|     _base.input.charCodeAt(1) === _charcodes.charCodes.exclamationMark | ||||
|   ) { | ||||
|     _index.skipLineComment.call(void 0, 2); | ||||
|   } | ||||
|   _index.nextToken.call(void 0, ); | ||||
|   return _statement.parseTopLevel.call(void 0, ); | ||||
| } exports.parseFile = parseFile; | ||||
							
								
								
									
159	node_modules/sucrase/dist/parser/traverser/lval.js	generated	vendored	Normal file
									
								
							|  | @ -0,0 +1,159 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _flow = require('../plugins/flow'); | ||||
| var _typescript = require('../plugins/typescript'); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _index = require('../tokenizer/index'); | ||||
| var _keywords = require('../tokenizer/keywords'); | ||||
| var _types = require('../tokenizer/types'); | ||||
| var _base = require('./base'); | ||||
| var _expression = require('./expression'); | ||||
| var _util = require('./util'); | ||||
| 
 | ||||
|  function parseSpread() { | ||||
|   _index.next.call(void 0, ); | ||||
|   _expression.parseMaybeAssign.call(void 0, false); | ||||
| } exports.parseSpread = parseSpread; | ||||
| 
 | ||||
|  function parseRest(isBlockScope) { | ||||
|   _index.next.call(void 0, ); | ||||
|   parseBindingAtom(isBlockScope); | ||||
| } exports.parseRest = parseRest; | ||||
| 
 | ||||
|  function parseBindingIdentifier(isBlockScope) { | ||||
|   _expression.parseIdentifier.call(void 0, ); | ||||
|   markPriorBindingIdentifier(isBlockScope); | ||||
| } exports.parseBindingIdentifier = parseBindingIdentifier; | ||||
| 
 | ||||
|  function parseImportedIdentifier() { | ||||
|   _expression.parseIdentifier.call(void 0, ); | ||||
|   _base.state.tokens[_base.state.tokens.length - 1].identifierRole = _index.IdentifierRole.ImportDeclaration; | ||||
| } exports.parseImportedIdentifier = parseImportedIdentifier; | ||||
| 
 | ||||
|  function markPriorBindingIdentifier(isBlockScope) { | ||||
|   let identifierRole; | ||||
|   if (_base.state.scopeDepth === 0) { | ||||
|     identifierRole = _index.IdentifierRole.TopLevelDeclaration; | ||||
|   } else if (isBlockScope) { | ||||
|     identifierRole = _index.IdentifierRole.BlockScopedDeclaration; | ||||
|   } else { | ||||
|     identifierRole = _index.IdentifierRole.FunctionScopedDeclaration; | ||||
|   } | ||||
|   _base.state.tokens[_base.state.tokens.length - 1].identifierRole = identifierRole; | ||||
| } exports.markPriorBindingIdentifier = markPriorBindingIdentifier; | ||||
| 
 | ||||
| // Parses lvalue (assignable) atom.
 | ||||
|  function parseBindingAtom(isBlockScope) { | ||||
|   switch (_base.state.type) { | ||||
|     case _types.TokenType._this: { | ||||
|       // In TypeScript, "this" may be the name of a parameter, so allow it.
 | ||||
|       const oldIsType = _index.pushTypeContext.call(void 0, 0); | ||||
|       _index.next.call(void 0, ); | ||||
|       _index.popTypeContext.call(void 0, oldIsType); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case _types.TokenType._yield: | ||||
|     case _types.TokenType.name: { | ||||
|       _base.state.type = _types.TokenType.name; | ||||
|       parseBindingIdentifier(isBlockScope); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case _types.TokenType.bracketL: { | ||||
|       _index.next.call(void 0, ); | ||||
|       parseBindingList(_types.TokenType.bracketR, isBlockScope, true /* allowEmpty */); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     case _types.TokenType.braceL: | ||||
|       _expression.parseObj.call(void 0, true, isBlockScope); | ||||
|       return; | ||||
| 
 | ||||
|     default: | ||||
|       _util.unexpected.call(void 0, ); | ||||
|   } | ||||
| } exports.parseBindingAtom = parseBindingAtom; | ||||
| 
 | ||||
|  function parseBindingList( | ||||
|   close, | ||||
|   isBlockScope, | ||||
|   allowEmpty = false, | ||||
|   allowModifiers = false, | ||||
|   contextId = 0, | ||||
| ) { | ||||
|   let first = true; | ||||
| 
 | ||||
|   let hasRemovedComma = false; | ||||
|   const firstItemTokenIndex = _base.state.tokens.length; | ||||
| 
 | ||||
|   while (!_index.eat.call(void 0, close) && !_base.state.error) { | ||||
|     if (first) { | ||||
|       first = false; | ||||
|     } else { | ||||
|       _util.expect.call(void 0, _types.TokenType.comma); | ||||
|       _base.state.tokens[_base.state.tokens.length - 1].contextId = contextId; | ||||
|       // After a "this" type in TypeScript, we need to set the following comma (if any) to also be
 | ||||
|       // a type token so that it will be removed.
 | ||||
|       if (!hasRemovedComma && _base.state.tokens[firstItemTokenIndex].isType) { | ||||
|         _base.state.tokens[_base.state.tokens.length - 1].isType = true; | ||||
|         hasRemovedComma = true; | ||||
|       } | ||||
|     } | ||||
|     if (allowEmpty && _index.match.call(void 0, _types.TokenType.comma)) { | ||||
|       // Empty item; nothing further to parse for this item.
 | ||||
|     } else if (_index.eat.call(void 0, close)) { | ||||
|       break; | ||||
|     } else if (_index.match.call(void 0, _types.TokenType.ellipsis)) { | ||||
|       parseRest(isBlockScope); | ||||
|       parseAssignableListItemTypes(); | ||||
|       // Support rest element trailing commas allowed by TypeScript <2.9.
 | ||||
|       _index.eat.call(void 0, _types.TokenType.comma); | ||||
|       _util.expect.call(void 0, close); | ||||
|       break; | ||||
|     } else { | ||||
|       parseAssignableListItem(allowModifiers, isBlockScope); | ||||
|     } | ||||
|   } | ||||
| } exports.parseBindingList = parseBindingList; | ||||
| 
 | ||||
| function parseAssignableListItem(allowModifiers, isBlockScope) { | ||||
|   if (allowModifiers) { | ||||
|     _typescript.tsParseModifiers.call(void 0, [ | ||||
|       _keywords.ContextualKeyword._public, | ||||
|       _keywords.ContextualKeyword._protected, | ||||
|       _keywords.ContextualKeyword._private, | ||||
|       _keywords.ContextualKeyword._readonly, | ||||
|       _keywords.ContextualKeyword._override, | ||||
|     ]); | ||||
|   } | ||||
| 
 | ||||
|   parseMaybeDefault(isBlockScope); | ||||
|   parseAssignableListItemTypes(); | ||||
|   parseMaybeDefault(isBlockScope, true /* leftAlreadyParsed */); | ||||
| } | ||||
| 
 | ||||
| function parseAssignableListItemTypes() { | ||||
|   if (_base.isFlowEnabled) { | ||||
|     _flow.flowParseAssignableListItemTypes.call(void 0, ); | ||||
|   } else if (_base.isTypeScriptEnabled) { | ||||
|     _typescript.tsParseAssignableListItemTypes.call(void 0, ); | ||||
|   } | ||||
| } | ||||
| 
 | ||||
| // Parses assignment pattern around given atom if possible.
 | ||||
|  function parseMaybeDefault(isBlockScope, leftAlreadyParsed = false) { | ||||
|   if (!leftAlreadyParsed) { | ||||
|     parseBindingAtom(isBlockScope); | ||||
|   } | ||||
|   if (!_index.eat.call(void 0, _types.TokenType.eq)) { | ||||
|     return; | ||||
|   } | ||||
|   const eqIndex = _base.state.tokens.length - 1; | ||||
|   _expression.parseMaybeAssign.call(void 0, ); | ||||
|   _base.state.tokens[eqIndex].rhsEndIndex = _base.state.tokens.length; | ||||
| } exports.parseMaybeDefault = parseMaybeDefault; | ||||
							
								
								
									
1332	node_modules/sucrase/dist/parser/traverser/statement.js	generated	vendored	Normal file	(file diff suppressed because it is too large)
104	node_modules/sucrase/dist/parser/traverser/util.js	generated	vendored	Normal file
									
								
							|  | @ -0,0 +1,104 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _index = require('../tokenizer/index'); | ||||
| 
 | ||||
| var _types = require('../tokenizer/types'); | ||||
| var _charcodes = require('../util/charcodes'); | ||||
| var _base = require('./base'); | ||||
| 
 | ||||
| // ## Parser utilities
 | ||||
| 
 | ||||
| // Tests whether parsed token is a contextual keyword.
 | ||||
|  function isContextual(contextualKeyword) { | ||||
|   return _base.state.contextualKeyword === contextualKeyword; | ||||
| } exports.isContextual = isContextual; | ||||
| 
 | ||||
|  function isLookaheadContextual(contextualKeyword) { | ||||
|   const l = _index.lookaheadTypeAndKeyword.call(void 0, ); | ||||
|   return l.type === _types.TokenType.name && l.contextualKeyword === contextualKeyword; | ||||
| } exports.isLookaheadContextual = isLookaheadContextual; | ||||
| 
 | ||||
| // Consumes contextual keyword if possible.
 | ||||
|  function eatContextual(contextualKeyword) { | ||||
|   return _base.state.contextualKeyword === contextualKeyword && _index.eat.call(void 0, _types.TokenType.name); | ||||
| } exports.eatContextual = eatContextual; | ||||
| 
 | ||||
| // Asserts that following token is given contextual keyword.
 | ||||
|  function expectContextual(contextualKeyword) { | ||||
|   if (!eatContextual(contextualKeyword)) { | ||||
|     unexpected(); | ||||
|   } | ||||
| } exports.expectContextual = expectContextual; | ||||
| 
 | ||||
| // Test whether a semicolon can be inserted at the current position.
 | ||||
|  function canInsertSemicolon() { | ||||
|   return _index.match.call(void 0, _types.TokenType.eof) || _index.match.call(void 0, _types.TokenType.braceR) || hasPrecedingLineBreak(); | ||||
| } exports.canInsertSemicolon = canInsertSemicolon; | ||||
| 
 | ||||
|  function hasPrecedingLineBreak() { | ||||
|   const prevToken = _base.state.tokens[_base.state.tokens.length - 1]; | ||||
|   const lastTokEnd = prevToken ? prevToken.end : 0; | ||||
|   for (let i = lastTokEnd; i < _base.state.start; i++) { | ||||
|     const code = _base.input.charCodeAt(i); | ||||
|     if ( | ||||
|       code === _charcodes.charCodes.lineFeed || | ||||
|       code === _charcodes.charCodes.carriageReturn || | ||||
|       code === 0x2028 || | ||||
|       code === 0x2029 | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } exports.hasPrecedingLineBreak = hasPrecedingLineBreak; | ||||
| 
 | ||||
|  function hasFollowingLineBreak() { | ||||
|   const nextStart = _index.nextTokenStart.call(void 0, ); | ||||
|   for (let i = _base.state.end; i < nextStart; i++) { | ||||
|     const code = _base.input.charCodeAt(i); | ||||
|     if ( | ||||
|       code === _charcodes.charCodes.lineFeed || | ||||
|       code === _charcodes.charCodes.carriageReturn || | ||||
|       code === 0x2028 || | ||||
|       code === 0x2029 | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
|   return false; | ||||
| } exports.hasFollowingLineBreak = hasFollowingLineBreak; | ||||
| 
 | ||||
|  function isLineTerminator() { | ||||
|   return _index.eat.call(void 0, _types.TokenType.semi) || canInsertSemicolon(); | ||||
| } exports.isLineTerminator = isLineTerminator; | ||||
| 
 | ||||
| // Consume a semicolon, or, failing that, see if we are allowed to
 | ||||
| // pretend that there is a semicolon at this position.
 | ||||
|  function semicolon() { | ||||
|   if (!isLineTerminator()) { | ||||
|     unexpected('Unexpected token, expected ";"'); | ||||
|   } | ||||
| } exports.semicolon = semicolon; | ||||
| 
 | ||||
| // Expect a token of a given type. If found, consume it, otherwise,
 | ||||
| // raise an unexpected token error at given pos.
 | ||||
|  function expect(type) { | ||||
|   const matched = _index.eat.call(void 0, type); | ||||
|   if (!matched) { | ||||
|     unexpected(`Unexpected token, expected "${_types.formatTokenType.call(void 0, type)}"`); | ||||
|   } | ||||
| } exports.expect = expect; | ||||
| 
 | ||||
| /** | ||||
|  * Transition the parser to an error state. All code needs to be written to naturally unwind in this | ||||
|  * state, which allows us to backtrack without exceptions and without error plumbing everywhere. | ||||
|  */ | ||||
|  function unexpected(message = "Unexpected token", pos = _base.state.start) { | ||||
|   if (_base.state.error) { | ||||
|     return; | ||||
|   } | ||||
|   // eslint-disable-next-line @typescript-eslint/no-explicit-any
 | ||||
|   const err = new SyntaxError(message); | ||||
|   err.pos = pos; | ||||
|   _base.state.error = err; | ||||
|   _base.state.pos = _base.input.length; | ||||
|   _index.finishToken.call(void 0, _types.TokenType.eof); | ||||
| } exports.unexpected = unexpected; | ||||
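The `unexpected()` helper above never throws; a minimal standalone sketch (not the vendored module) of that error-state pattern, assuming a toy `state`/`input` pair: record only the first error on shared state and fast-forward `pos` to EOF so all parsing loops terminate and unwind naturally, with no exception plumbing.

```javascript
// Toy parser state; the real module keeps this in _base.state / _base.input.
const input = "let x = ;";
const state = {pos: 0, error: null};

function unexpected(message = "Unexpected token", pos = state.pos) {
  if (state.error) {
    return; // only the first error is kept
  }
  const err = new SyntaxError(message);
  err.pos = pos;
  state.error = err;
  state.pos = input.length; // jump to EOF so every loop sees no more input
}

unexpected('Unexpected token, expected ";"', 8);
unexpected("this later error is ignored");
console.log(state.error.message); // the first message wins
console.log(state.pos === input.length); // true
```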
							
								
								
									
										115
									node_modules/sucrase/dist/parser/util/charcodes.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,115 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var charCodes; (function (charCodes) { | ||||
|   const backSpace = 8; charCodes[charCodes["backSpace"] = backSpace] = "backSpace"; | ||||
|   const lineFeed = 10; charCodes[charCodes["lineFeed"] = lineFeed] = "lineFeed"; //  '\n'
 | ||||
|   const tab = 9; charCodes[charCodes["tab"] = tab] = "tab"; //  '\t'
 | ||||
|   const carriageReturn = 13; charCodes[charCodes["carriageReturn"] = carriageReturn] = "carriageReturn"; //  '\r'
 | ||||
|   const shiftOut = 14; charCodes[charCodes["shiftOut"] = shiftOut] = "shiftOut"; | ||||
|   const space = 32; charCodes[charCodes["space"] = space] = "space"; | ||||
|   const exclamationMark = 33; charCodes[charCodes["exclamationMark"] = exclamationMark] = "exclamationMark"; //  '!'
 | ||||
|   const quotationMark = 34; charCodes[charCodes["quotationMark"] = quotationMark] = "quotationMark"; //  '"'
 | ||||
|   const numberSign = 35; charCodes[charCodes["numberSign"] = numberSign] = "numberSign"; //  '#'
 | ||||
|   const dollarSign = 36; charCodes[charCodes["dollarSign"] = dollarSign] = "dollarSign"; //  '$'
 | ||||
|   const percentSign = 37; charCodes[charCodes["percentSign"] = percentSign] = "percentSign"; //  '%'
 | ||||
|   const ampersand = 38; charCodes[charCodes["ampersand"] = ampersand] = "ampersand"; //  '&'
 | ||||
|   const apostrophe = 39; charCodes[charCodes["apostrophe"] = apostrophe] = "apostrophe"; //  '''
 | ||||
|   const leftParenthesis = 40; charCodes[charCodes["leftParenthesis"] = leftParenthesis] = "leftParenthesis"; //  '('
 | ||||
|   const rightParenthesis = 41; charCodes[charCodes["rightParenthesis"] = rightParenthesis] = "rightParenthesis"; //  ')'
 | ||||
|   const asterisk = 42; charCodes[charCodes["asterisk"] = asterisk] = "asterisk"; //  '*'
 | ||||
|   const plusSign = 43; charCodes[charCodes["plusSign"] = plusSign] = "plusSign"; //  '+'
 | ||||
|   const comma = 44; charCodes[charCodes["comma"] = comma] = "comma"; //  ','
 | ||||
|   const dash = 45; charCodes[charCodes["dash"] = dash] = "dash"; //  '-'
 | ||||
|   const dot = 46; charCodes[charCodes["dot"] = dot] = "dot"; //  '.'
 | ||||
|   const slash = 47; charCodes[charCodes["slash"] = slash] = "slash"; //  '/'
 | ||||
|   const digit0 = 48; charCodes[charCodes["digit0"] = digit0] = "digit0"; //  '0'
 | ||||
|   const digit1 = 49; charCodes[charCodes["digit1"] = digit1] = "digit1"; //  '1'
 | ||||
|   const digit2 = 50; charCodes[charCodes["digit2"] = digit2] = "digit2"; //  '2'
 | ||||
|   const digit3 = 51; charCodes[charCodes["digit3"] = digit3] = "digit3"; //  '3'
 | ||||
|   const digit4 = 52; charCodes[charCodes["digit4"] = digit4] = "digit4"; //  '4'
 | ||||
|   const digit5 = 53; charCodes[charCodes["digit5"] = digit5] = "digit5"; //  '5'
 | ||||
|   const digit6 = 54; charCodes[charCodes["digit6"] = digit6] = "digit6"; //  '6'
 | ||||
|   const digit7 = 55; charCodes[charCodes["digit7"] = digit7] = "digit7"; //  '7'
 | ||||
|   const digit8 = 56; charCodes[charCodes["digit8"] = digit8] = "digit8"; //  '8'
 | ||||
|   const digit9 = 57; charCodes[charCodes["digit9"] = digit9] = "digit9"; //  '9'
 | ||||
|   const colon = 58; charCodes[charCodes["colon"] = colon] = "colon"; //  ':'
 | ||||
|   const semicolon = 59; charCodes[charCodes["semicolon"] = semicolon] = "semicolon"; //  ';'
 | ||||
|   const lessThan = 60; charCodes[charCodes["lessThan"] = lessThan] = "lessThan"; //  '<'
 | ||||
|   const equalsTo = 61; charCodes[charCodes["equalsTo"] = equalsTo] = "equalsTo"; //  '='
 | ||||
|   const greaterThan = 62; charCodes[charCodes["greaterThan"] = greaterThan] = "greaterThan"; //  '>'
 | ||||
|   const questionMark = 63; charCodes[charCodes["questionMark"] = questionMark] = "questionMark"; //  '?'
 | ||||
|   const atSign = 64; charCodes[charCodes["atSign"] = atSign] = "atSign"; //  '@'
 | ||||
|   const uppercaseA = 65; charCodes[charCodes["uppercaseA"] = uppercaseA] = "uppercaseA"; //  'A'
 | ||||
|   const uppercaseB = 66; charCodes[charCodes["uppercaseB"] = uppercaseB] = "uppercaseB"; //  'B'
 | ||||
|   const uppercaseC = 67; charCodes[charCodes["uppercaseC"] = uppercaseC] = "uppercaseC"; //  'C'
 | ||||
|   const uppercaseD = 68; charCodes[charCodes["uppercaseD"] = uppercaseD] = "uppercaseD"; //  'D'
 | ||||
|   const uppercaseE = 69; charCodes[charCodes["uppercaseE"] = uppercaseE] = "uppercaseE"; //  'E'
 | ||||
|   const uppercaseF = 70; charCodes[charCodes["uppercaseF"] = uppercaseF] = "uppercaseF"; //  'F'
 | ||||
|   const uppercaseG = 71; charCodes[charCodes["uppercaseG"] = uppercaseG] = "uppercaseG"; //  'G'
 | ||||
|   const uppercaseH = 72; charCodes[charCodes["uppercaseH"] = uppercaseH] = "uppercaseH"; //  'H'
 | ||||
|   const uppercaseI = 73; charCodes[charCodes["uppercaseI"] = uppercaseI] = "uppercaseI"; //  'I'
 | ||||
|   const uppercaseJ = 74; charCodes[charCodes["uppercaseJ"] = uppercaseJ] = "uppercaseJ"; //  'J'
 | ||||
|   const uppercaseK = 75; charCodes[charCodes["uppercaseK"] = uppercaseK] = "uppercaseK"; //  'K'
 | ||||
|   const uppercaseL = 76; charCodes[charCodes["uppercaseL"] = uppercaseL] = "uppercaseL"; //  'L'
 | ||||
|   const uppercaseM = 77; charCodes[charCodes["uppercaseM"] = uppercaseM] = "uppercaseM"; //  'M'
 | ||||
|   const uppercaseN = 78; charCodes[charCodes["uppercaseN"] = uppercaseN] = "uppercaseN"; //  'N'
 | ||||
|   const uppercaseO = 79; charCodes[charCodes["uppercaseO"] = uppercaseO] = "uppercaseO"; //  'O'
 | ||||
|   const uppercaseP = 80; charCodes[charCodes["uppercaseP"] = uppercaseP] = "uppercaseP"; //  'P'
 | ||||
|   const uppercaseQ = 81; charCodes[charCodes["uppercaseQ"] = uppercaseQ] = "uppercaseQ"; //  'Q'
 | ||||
|   const uppercaseR = 82; charCodes[charCodes["uppercaseR"] = uppercaseR] = "uppercaseR"; //  'R'
 | ||||
|   const uppercaseS = 83; charCodes[charCodes["uppercaseS"] = uppercaseS] = "uppercaseS"; //  'S'
 | ||||
|   const uppercaseT = 84; charCodes[charCodes["uppercaseT"] = uppercaseT] = "uppercaseT"; //  'T'
 | ||||
|   const uppercaseU = 85; charCodes[charCodes["uppercaseU"] = uppercaseU] = "uppercaseU"; //  'U'
 | ||||
|   const uppercaseV = 86; charCodes[charCodes["uppercaseV"] = uppercaseV] = "uppercaseV"; //  'V'
 | ||||
|   const uppercaseW = 87; charCodes[charCodes["uppercaseW"] = uppercaseW] = "uppercaseW"; //  'W'
 | ||||
|   const uppercaseX = 88; charCodes[charCodes["uppercaseX"] = uppercaseX] = "uppercaseX"; //  'X'
 | ||||
|   const uppercaseY = 89; charCodes[charCodes["uppercaseY"] = uppercaseY] = "uppercaseY"; //  'Y'
 | ||||
|   const uppercaseZ = 90; charCodes[charCodes["uppercaseZ"] = uppercaseZ] = "uppercaseZ"; //  'Z'
 | ||||
|   const leftSquareBracket = 91; charCodes[charCodes["leftSquareBracket"] = leftSquareBracket] = "leftSquareBracket"; //  '['
 | ||||
|   const backslash = 92; charCodes[charCodes["backslash"] = backslash] = "backslash"; //  '\'
 | ||||
|   const rightSquareBracket = 93; charCodes[charCodes["rightSquareBracket"] = rightSquareBracket] = "rightSquareBracket"; //  ']'
 | ||||
|   const caret = 94; charCodes[charCodes["caret"] = caret] = "caret"; //  '^'
 | ||||
|   const underscore = 95; charCodes[charCodes["underscore"] = underscore] = "underscore"; //  '_'
 | ||||
|   const graveAccent = 96; charCodes[charCodes["graveAccent"] = graveAccent] = "graveAccent"; //  '`'
 | ||||
|   const lowercaseA = 97; charCodes[charCodes["lowercaseA"] = lowercaseA] = "lowercaseA"; //  'a'
 | ||||
|   const lowercaseB = 98; charCodes[charCodes["lowercaseB"] = lowercaseB] = "lowercaseB"; //  'b'
 | ||||
|   const lowercaseC = 99; charCodes[charCodes["lowercaseC"] = lowercaseC] = "lowercaseC"; //  'c'
 | ||||
|   const lowercaseD = 100; charCodes[charCodes["lowercaseD"] = lowercaseD] = "lowercaseD"; //  'd'
 | ||||
|   const lowercaseE = 101; charCodes[charCodes["lowercaseE"] = lowercaseE] = "lowercaseE"; //  'e'
 | ||||
|   const lowercaseF = 102; charCodes[charCodes["lowercaseF"] = lowercaseF] = "lowercaseF"; //  'f'
 | ||||
|   const lowercaseG = 103; charCodes[charCodes["lowercaseG"] = lowercaseG] = "lowercaseG"; //  'g'
 | ||||
|   const lowercaseH = 104; charCodes[charCodes["lowercaseH"] = lowercaseH] = "lowercaseH"; //  'h'
 | ||||
|   const lowercaseI = 105; charCodes[charCodes["lowercaseI"] = lowercaseI] = "lowercaseI"; //  'i'
 | ||||
|   const lowercaseJ = 106; charCodes[charCodes["lowercaseJ"] = lowercaseJ] = "lowercaseJ"; //  'j'
 | ||||
|   const lowercaseK = 107; charCodes[charCodes["lowercaseK"] = lowercaseK] = "lowercaseK"; //  'k'
 | ||||
|   const lowercaseL = 108; charCodes[charCodes["lowercaseL"] = lowercaseL] = "lowercaseL"; //  'l'
 | ||||
|   const lowercaseM = 109; charCodes[charCodes["lowercaseM"] = lowercaseM] = "lowercaseM"; //  'm'
 | ||||
|   const lowercaseN = 110; charCodes[charCodes["lowercaseN"] = lowercaseN] = "lowercaseN"; //  'n'
 | ||||
|   const lowercaseO = 111; charCodes[charCodes["lowercaseO"] = lowercaseO] = "lowercaseO"; //  'o'
 | ||||
|   const lowercaseP = 112; charCodes[charCodes["lowercaseP"] = lowercaseP] = "lowercaseP"; //  'p'
 | ||||
|   const lowercaseQ = 113; charCodes[charCodes["lowercaseQ"] = lowercaseQ] = "lowercaseQ"; //  'q'
 | ||||
|   const lowercaseR = 114; charCodes[charCodes["lowercaseR"] = lowercaseR] = "lowercaseR"; //  'r'
 | ||||
|   const lowercaseS = 115; charCodes[charCodes["lowercaseS"] = lowercaseS] = "lowercaseS"; //  's'
 | ||||
|   const lowercaseT = 116; charCodes[charCodes["lowercaseT"] = lowercaseT] = "lowercaseT"; //  't'
 | ||||
|   const lowercaseU = 117; charCodes[charCodes["lowercaseU"] = lowercaseU] = "lowercaseU"; //  'u'
 | ||||
|   const lowercaseV = 118; charCodes[charCodes["lowercaseV"] = lowercaseV] = "lowercaseV"; //  'v'
 | ||||
|   const lowercaseW = 119; charCodes[charCodes["lowercaseW"] = lowercaseW] = "lowercaseW"; //  'w'
 | ||||
|   const lowercaseX = 120; charCodes[charCodes["lowercaseX"] = lowercaseX] = "lowercaseX"; //  'x'
 | ||||
|   const lowercaseY = 121; charCodes[charCodes["lowercaseY"] = lowercaseY] = "lowercaseY"; //  'y'
 | ||||
|   const lowercaseZ = 122; charCodes[charCodes["lowercaseZ"] = lowercaseZ] = "lowercaseZ"; //  'z'
 | ||||
|   const leftCurlyBrace = 123; charCodes[charCodes["leftCurlyBrace"] = leftCurlyBrace] = "leftCurlyBrace"; //  '{'
 | ||||
|   const verticalBar = 124; charCodes[charCodes["verticalBar"] = verticalBar] = "verticalBar"; //  '|'
 | ||||
|   const rightCurlyBrace = 125; charCodes[charCodes["rightCurlyBrace"] = rightCurlyBrace] = "rightCurlyBrace"; //  '}'
 | ||||
|   const tilde = 126; charCodes[charCodes["tilde"] = tilde] = "tilde"; //  '~'
 | ||||
|   const nonBreakingSpace = 160; charCodes[charCodes["nonBreakingSpace"] = nonBreakingSpace] = "nonBreakingSpace"; | ||||
|   // eslint-disable-next-line no-irregular-whitespace
 | ||||
|   const oghamSpaceMark = 5760; charCodes[charCodes["oghamSpaceMark"] = oghamSpaceMark] = "oghamSpaceMark"; // ' '
 | ||||
|   const lineSeparator = 8232; charCodes[charCodes["lineSeparator"] = lineSeparator] = "lineSeparator"; | ||||
|   const paragraphSeparator = 8233; charCodes[charCodes["paragraphSeparator"] = paragraphSeparator] = "paragraphSeparator"; | ||||
| })(charCodes || (exports.charCodes = charCodes = {})); | ||||
| 
 | ||||
|  function isDigit(code) { | ||||
|   return ( | ||||
|     (code >= charCodes.digit0 && code <= charCodes.digit9) || | ||||
|     (code >= charCodes.lowercaseA && code <= charCodes.lowercaseF) || | ||||
|     (code >= charCodes.uppercaseA && code <= charCodes.uppercaseF) | ||||
|   ); | ||||
| } exports.isDigit = isDigit; | ||||
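Note that `isDigit` above accepts hex letters `a`–`f`/`A`–`F` in addition to `0`–`9`. A standalone sketch of the same check, with the enum bounds inlined as plain numbers (digit0=48, digit9=57, uppercaseA=65, uppercaseF=70, lowercaseA=97, lowercaseF=102):

```javascript
// Mirrors the vendored isDigit: true for decimal digits and hex letters.
function isDigit(code) {
  return (
    (code >= 48 && code <= 57) ||  // '0'-'9'
    (code >= 97 && code <= 102) || // 'a'-'f'
    (code >= 65 && code <= 70)     // 'A'-'F'
  );
}

console.log(isDigit("7".charCodeAt(0))); // true
console.log(isDigit("e".charCodeAt(0))); // true (hex digit)
console.log(isDigit("g".charCodeAt(0))); // false
```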
							
								
								
									
										34
									node_modules/sucrase/dist/parser/util/identifier.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,34 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _charcodes = require('./charcodes'); | ||||
| var _whitespace = require('./whitespace'); | ||||
| 
 | ||||
| function computeIsIdentifierChar(code) { | ||||
|   if (code < 48) return code === 36; | ||||
|   if (code < 58) return true; | ||||
|   if (code < 65) return false; | ||||
|   if (code < 91) return true; | ||||
|   if (code < 97) return code === 95; | ||||
|   if (code < 123) return true; | ||||
|   if (code < 128) return false; | ||||
|   throw new Error("Should not be called with non-ASCII char code."); | ||||
| } | ||||
| 
 | ||||
|  const IS_IDENTIFIER_CHAR = new Uint8Array(65536); exports.IS_IDENTIFIER_CHAR = IS_IDENTIFIER_CHAR; | ||||
| for (let i = 0; i < 128; i++) { | ||||
|   exports.IS_IDENTIFIER_CHAR[i] = computeIsIdentifierChar(i) ? 1 : 0; | ||||
| } | ||||
| for (let i = 128; i < 65536; i++) { | ||||
|   exports.IS_IDENTIFIER_CHAR[i] = 1; | ||||
| } | ||||
| // Aside from whitespace and newlines, all characters outside the ASCII space are either
 | ||||
| // identifier characters or invalid. Since we're not performing code validation, we can just
 | ||||
| // treat all invalid characters as identifier characters.
 | ||||
| for (const whitespaceChar of _whitespace.WHITESPACE_CHARS) { | ||||
|   exports.IS_IDENTIFIER_CHAR[whitespaceChar] = 0; | ||||
| } | ||||
| exports.IS_IDENTIFIER_CHAR[0x2028] = 0; | ||||
| exports.IS_IDENTIFIER_CHAR[0x2029] = 0; | ||||
| 
 | ||||
|  const IS_IDENTIFIER_START = exports.IS_IDENTIFIER_CHAR.slice(); exports.IS_IDENTIFIER_START = IS_IDENTIFIER_START; | ||||
| for (let numChar = _charcodes.charCodes.digit0; numChar <= _charcodes.charCodes.digit9; numChar++) { | ||||
|   exports.IS_IDENTIFIER_START[numChar] = 0; | ||||
| } | ||||
							
								
								
									
										33
									node_modules/sucrase/dist/parser/util/whitespace.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,33 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true});var _charcodes = require('./charcodes'); | ||||
| 
 | ||||
| // https://tc39.github.io/ecma262/#sec-white-space
 | ||||
|  const WHITESPACE_CHARS = [ | ||||
|   0x0009, | ||||
|   0x000b, | ||||
|   0x000c, | ||||
|   _charcodes.charCodes.space, | ||||
|   _charcodes.charCodes.nonBreakingSpace, | ||||
|   _charcodes.charCodes.oghamSpaceMark, | ||||
|   0x2000, // EN QUAD
 | ||||
|   0x2001, // EM QUAD
 | ||||
|   0x2002, // EN SPACE
 | ||||
|   0x2003, // EM SPACE
 | ||||
|   0x2004, // THREE-PER-EM SPACE
 | ||||
|   0x2005, // FOUR-PER-EM SPACE
 | ||||
|   0x2006, // SIX-PER-EM SPACE
 | ||||
|   0x2007, // FIGURE SPACE
 | ||||
|   0x2008, // PUNCTUATION SPACE
 | ||||
|   0x2009, // THIN SPACE
 | ||||
|   0x200a, // HAIR SPACE
 | ||||
|   0x202f, // NARROW NO-BREAK SPACE
 | ||||
|   0x205f, // MEDIUM MATHEMATICAL SPACE
 | ||||
|   0x3000, // IDEOGRAPHIC SPACE
 | ||||
|   0xfeff, // ZERO WIDTH NO-BREAK SPACE
 | ||||
| ]; exports.WHITESPACE_CHARS = WHITESPACE_CHARS; | ||||
| 
 | ||||
|  const skipWhiteSpace = /(?:\s|\/\/.*|\/\*[^]*?\*\/)*/g; exports.skipWhiteSpace = skipWhiteSpace; | ||||
| 
 | ||||
|  const IS_WHITESPACE = new Uint8Array(65536); exports.IS_WHITESPACE = IS_WHITESPACE; | ||||
| for (const char of exports.WHITESPACE_CHARS) { | ||||
|   exports.IS_WHITESPACE[char] = 1; | ||||
| } | ||||
							
								
								
									
										88
									node_modules/sucrase/dist/register.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,88 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireWildcard(obj) { if (obj && obj.__esModule) { return obj; } else { var newObj = {}; if (obj != null) { for (var key in obj) { if (Object.prototype.hasOwnProperty.call(obj, key)) { newObj[key] = obj[key]; } } } newObj.default = obj; return newObj; } }var _pirates = require('pirates'); var pirates = _interopRequireWildcard(_pirates); | ||||
| 
 | ||||
| var _index = require('./index'); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
|  function addHook( | ||||
|   extension, | ||||
|   sucraseOptions, | ||||
|   hookOptions, | ||||
| ) { | ||||
|   let mergedSucraseOptions = sucraseOptions; | ||||
|   const sucraseOptionsEnvJSON = process.env.SUCRASE_OPTIONS; | ||||
|   if (sucraseOptionsEnvJSON) { | ||||
|     mergedSucraseOptions = {...mergedSucraseOptions, ...JSON.parse(sucraseOptionsEnvJSON)}; | ||||
|   } | ||||
|   return pirates.addHook( | ||||
|     (code, filePath) => { | ||||
|       const {code: transformedCode, sourceMap} = _index.transform.call(void 0, code, { | ||||
|         ...mergedSucraseOptions, | ||||
|         sourceMapOptions: {compiledFilename: filePath}, | ||||
|         filePath, | ||||
|       }); | ||||
|       const mapBase64 = Buffer.from(JSON.stringify(sourceMap)).toString("base64"); | ||||
|       const suffix = `//# sourceMappingURL=data:application/json;charset=utf-8;base64,${mapBase64}`; | ||||
|       return `${transformedCode}\n${suffix}`; | ||||
|     }, | ||||
|     {...hookOptions, exts: [extension]}, | ||||
|   ); | ||||
| } exports.addHook = addHook; | ||||
| 
 | ||||
|  function registerJS(hookOptions) { | ||||
|   return addHook(".js", {transforms: ["imports", "flow", "jsx"]}, hookOptions); | ||||
| } exports.registerJS = registerJS; | ||||
| 
 | ||||
|  function registerJSX(hookOptions) { | ||||
|   return addHook(".jsx", {transforms: ["imports", "flow", "jsx"]}, hookOptions); | ||||
| } exports.registerJSX = registerJSX; | ||||
| 
 | ||||
|  function registerTS(hookOptions) { | ||||
|   return addHook(".ts", {transforms: ["imports", "typescript"]}, hookOptions); | ||||
| } exports.registerTS = registerTS; | ||||
| 
 | ||||
|  function registerTSX(hookOptions) { | ||||
|   return addHook(".tsx", {transforms: ["imports", "typescript", "jsx"]}, hookOptions); | ||||
| } exports.registerTSX = registerTSX; | ||||
| 
 | ||||
|  function registerTSLegacyModuleInterop(hookOptions) { | ||||
|   return addHook( | ||||
|     ".ts", | ||||
|     { | ||||
|       transforms: ["imports", "typescript"], | ||||
|       enableLegacyTypeScriptModuleInterop: true, | ||||
|     }, | ||||
|     hookOptions, | ||||
|   ); | ||||
| } exports.registerTSLegacyModuleInterop = registerTSLegacyModuleInterop; | ||||
| 
 | ||||
|  function registerTSXLegacyModuleInterop(hookOptions) { | ||||
|   return addHook( | ||||
|     ".tsx", | ||||
|     { | ||||
|       transforms: ["imports", "typescript", "jsx"], | ||||
|       enableLegacyTypeScriptModuleInterop: true, | ||||
|     }, | ||||
|     hookOptions, | ||||
|   ); | ||||
| } exports.registerTSXLegacyModuleInterop = registerTSXLegacyModuleInterop; | ||||
| 
 | ||||
|  function registerAll(hookOptions) { | ||||
|   const reverts = [ | ||||
|     registerJS(hookOptions), | ||||
|     registerJSX(hookOptions), | ||||
|     registerTS(hookOptions), | ||||
|     registerTSX(hookOptions), | ||||
|   ]; | ||||
| 
 | ||||
|   return () => { | ||||
|     for (const fn of reverts) { | ||||
|       fn(); | ||||
|     } | ||||
|   }; | ||||
| } exports.registerAll = registerAll; | ||||
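A standalone sketch of the option-merging step inside `addHook()` above: options parsed from the `SUCRASE_OPTIONS` environment variable override the ones passed in. (`mergeSucraseOptions` is a hypothetical name for illustration; the vendored code inlines this logic.)

```javascript
// Env-provided JSON options win over programmatic ones, as in addHook.
function mergeSucraseOptions(sucraseOptions, env = process.env) {
  let merged = sucraseOptions;
  const envJSON = env.SUCRASE_OPTIONS;
  if (envJSON) {
    merged = {...merged, ...JSON.parse(envJSON)};
  }
  return merged;
}

const merged = mergeSucraseOptions(
  {transforms: ["imports", "typescript"], production: false},
  {SUCRASE_OPTIONS: '{"production": true}'},
);
console.log(merged.production); // true: the env JSON wins
console.log(merged.transforms); // untouched keys pass through
```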
							
								
								
									
										916
									node_modules/sucrase/dist/transformers/CJSImportTransformer.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,916 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } | ||||
| 
 | ||||
| 
 | ||||
| var _tokenizer = require('../parser/tokenizer'); | ||||
| var _keywords = require('../parser/tokenizer/keywords'); | ||||
| var _types = require('../parser/tokenizer/types'); | ||||
| 
 | ||||
| var _elideImportEquals = require('../util/elideImportEquals'); var _elideImportEquals2 = _interopRequireDefault(_elideImportEquals); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _getDeclarationInfo = require('../util/getDeclarationInfo'); var _getDeclarationInfo2 = _interopRequireDefault(_getDeclarationInfo); | ||||
| var _getImportExportSpecifierInfo = require('../util/getImportExportSpecifierInfo'); var _getImportExportSpecifierInfo2 = _interopRequireDefault(_getImportExportSpecifierInfo); | ||||
| var _isExportFrom = require('../util/isExportFrom'); var _isExportFrom2 = _interopRequireDefault(_isExportFrom); | ||||
| var _removeMaybeImportAttributes = require('../util/removeMaybeImportAttributes'); | ||||
| var _shouldElideDefaultExport = require('../util/shouldElideDefaultExport'); var _shouldElideDefaultExport2 = _interopRequireDefault(_shouldElideDefaultExport); | ||||
| 
 | ||||
| 
 | ||||
| var _Transformer = require('./Transformer'); var _Transformer2 = _interopRequireDefault(_Transformer); | ||||
| 
 | ||||
| /** | ||||
|  * Class for editing import statements when we are transforming to commonjs. | ||||
|  */ | ||||
|  class CJSImportTransformer extends _Transformer2.default { | ||||
|    __init() {this.hadExport = false} | ||||
|    __init2() {this.hadNamedExport = false} | ||||
|    __init3() {this.hadDefaultExport = false} | ||||
|    | ||||
| 
 | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      importProcessor, | ||||
|      nameManager, | ||||
|      helperManager, | ||||
|      reactHotLoaderTransformer, | ||||
|      enableLegacyBabel5ModuleInterop, | ||||
|      enableLegacyTypeScriptModuleInterop, | ||||
|      isTypeScriptTransformEnabled, | ||||
|      isFlowTransformEnabled, | ||||
|      preserveDynamicImport, | ||||
|      keepUnusedImports, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.importProcessor = importProcessor;this.nameManager = nameManager;this.helperManager = helperManager;this.reactHotLoaderTransformer = reactHotLoaderTransformer;this.enableLegacyBabel5ModuleInterop = enableLegacyBabel5ModuleInterop;this.enableLegacyTypeScriptModuleInterop = enableLegacyTypeScriptModuleInterop;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.isFlowTransformEnabled = isFlowTransformEnabled;this.preserveDynamicImport = preserveDynamicImport;this.keepUnusedImports = keepUnusedImports;CJSImportTransformer.prototype.__init.call(this);CJSImportTransformer.prototype.__init2.call(this);CJSImportTransformer.prototype.__init3.call(this);; | ||||
|     this.declarationInfo = isTypeScriptTransformEnabled | ||||
|       ? _getDeclarationInfo2.default.call(void 0, tokens) | ||||
|       : _getDeclarationInfo.EMPTY_DECLARATION_INFO; | ||||
|   } | ||||
| 
 | ||||
|   getPrefixCode() { | ||||
|     let prefix = ""; | ||||
|     if (this.hadExport) { | ||||
|       prefix += 'Object.defineProperty(exports, "__esModule", {value: true});'; | ||||
|     } | ||||
|     return prefix; | ||||
|   } | ||||
| 
 | ||||
|   getSuffixCode() { | ||||
|     if (this.enableLegacyBabel5ModuleInterop && this.hadDefaultExport && !this.hadNamedExport) { | ||||
|       return "\nmodule.exports = exports.default;\n"; | ||||
|     } | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     // TypeScript `import foo = require('foo');` should always just be translated to plain require.
 | ||||
|     if (this.tokens.matches3(_types.TokenType._import, _types.TokenType.name, _types.TokenType.eq)) { | ||||
|       return this.processImportEquals(); | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType._import)) { | ||||
|       this.processImport(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType.eq)) { | ||||
|       this.tokens.replaceToken("module.exports"); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType._export) && !this.tokens.currentToken().isType) { | ||||
|       this.hadExport = true; | ||||
|       return this.processExport(); | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType.name, _types.TokenType.postIncDec)) { | ||||
|       // Fall through to normal identifier matching if this doesn't apply.
 | ||||
|       if (this.processPostIncDec()) { | ||||
|         return true; | ||||
|       } | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType.name) || this.tokens.matches1(_types.TokenType.jsxName)) { | ||||
|       return this.processIdentifier(); | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType.eq)) { | ||||
|       return this.processAssignment(); | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType.assign)) { | ||||
|       return this.processComplexAssignment(); | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType.preIncDec)) { | ||||
|       return this.processPreIncDec(); | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|    processImportEquals() { | ||||
|     const importName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|     if (this.importProcessor.shouldAutomaticallyElideImportedName(importName)) { | ||||
|       // If this name is only used as a type, elide the whole import.
 | ||||
|       _elideImportEquals2.default.call(void 0, this.tokens); | ||||
|     } else { | ||||
|       // Otherwise, switch `import` to `const`.
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * import foo, {bar} from 'baz'; | ||||
|    * into | ||||
|    * var _baz = require('baz'); var _baz2 = _interopRequireDefault(_baz); | ||||
|    * | ||||
|    * The import code was already generated in the import preprocessing step, so | ||||
|    * we just need to look it up. | ||||
|    */ | ||||
|    processImport() { | ||||
|     if (this.tokens.matches2(_types.TokenType._import, _types.TokenType.parenL)) { | ||||
|       if (this.preserveDynamicImport) { | ||||
|         // Bail out, only making progress for this one token.
 | ||||
|         this.tokens.copyToken(); | ||||
|         return; | ||||
|       } | ||||
|       const requireWrapper = this.enableLegacyTypeScriptModuleInterop | ||||
|         ? "" | ||||
|         : `${this.helperManager.getHelperName("interopRequireWildcard")}(`; | ||||
|       this.tokens.replaceToken(`Promise.resolve().then(() => ${requireWrapper}require`); | ||||
|       const contextId = this.tokens.currentToken().contextId; | ||||
|       if (contextId == null) { | ||||
|         throw new Error("Expected context ID on dynamic import invocation."); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       while (!this.tokens.matchesContextIdAndLabel(_types.TokenType.parenR, contextId)) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.replaceToken(requireWrapper ? ")))" : "))"); | ||||
|       return; | ||||
|     } | ||||
| 
 | ||||
|     const shouldElideImport = this.removeImportAndDetectIfShouldElide(); | ||||
|     if (shouldElideImport) { | ||||
|       this.tokens.removeToken(); | ||||
|     } else { | ||||
|       const path = this.tokens.stringValue(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|       this.tokens.appendCode(this.importProcessor.claimImportCode(path)); | ||||
|     } | ||||
|     _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|     if (this.tokens.matches1(_types.TokenType.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Erase this import (since any CJS output would be completely different), and | ||||
|    * return true if this import should be elided due to being a type-only | ||||
|    * import. Such imports will not be emitted at all to avoid side effects. | ||||
|    * | ||||
|    * Import elision only happens with the TypeScript or Flow transforms enabled. | ||||
|    * | ||||
|    * TODO: This function has some awkward overlap with | ||||
|    *  CJSImportProcessor.pruneTypeOnlyImports, and the two should be unified. | ||||
|    *  That function handles TypeScript implicit import name elision, and removes | ||||
|    *  an import if all typical imported names (without `type`) are removed due | ||||
|    *  to being type-only imports. This function handles Flow import removal and | ||||
|    *  properly distinguishes `import 'foo'` from `import {} from 'foo'` for TS | ||||
|    *  purposes. | ||||
|    * | ||||
|    * The position should end at the import string. | ||||
|    */ | ||||
|    removeImportAndDetectIfShouldElide() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(_keywords.ContextualKeyword._type) && | ||||
|       !this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, _types.TokenType.comma) && | ||||
|       !this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, _keywords.ContextualKeyword._from) | ||||
|     ) { | ||||
|       // This is an "import type" statement, so exit early.
 | ||||
|       this.removeRemainingImport(); | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.name) || this.tokens.matches1(_types.TokenType.star)) { | ||||
|       // We have a default import or namespace import, so there must be some
 | ||||
|       // non-type import.
 | ||||
|       this.removeRemainingImport(); | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.string)) { | ||||
|       // This is a bare import, so we should proceed with the import. | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     let foundNonTypeImport = false; | ||||
|     let foundAnyNamedImport = false; | ||||
|     while (!this.tokens.matches1(_types.TokenType.string)) { | ||||
|       // Check if any named imports are of the form "foo" or "foo as bar", with | ||||
|       // no leading "type". | ||||
|       if ( | ||||
|         (!foundNonTypeImport && this.tokens.matches1(_types.TokenType.braceL)) || | ||||
|         this.tokens.matches1(_types.TokenType.comma) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         if (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|           foundAnyNamedImport = true; | ||||
|         } | ||||
|         if ( | ||||
|           this.tokens.matches2(_types.TokenType.name, _types.TokenType.comma) || | ||||
|           this.tokens.matches2(_types.TokenType.name, _types.TokenType.braceR) || | ||||
|           this.tokens.matches4(_types.TokenType.name, _types.TokenType.name, _types.TokenType.name, _types.TokenType.comma) || | ||||
|           this.tokens.matches4(_types.TokenType.name, _types.TokenType.name, _types.TokenType.name, _types.TokenType.braceR) | ||||
|         ) { | ||||
|           foundNonTypeImport = true; | ||||
|         } | ||||
|       } | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     if (this.keepUnusedImports) { | ||||
|       return false; | ||||
|     } | ||||
|     if (this.isTypeScriptTransformEnabled) { | ||||
|       return !foundNonTypeImport; | ||||
|     } else if (this.isFlowTransformEnabled) { | ||||
|       // In Flow, unlike TS, `import {} from 'foo';` preserves the import. | ||||
|       return foundAnyNamedImport && !foundNonTypeImport; | ||||
|     } else { | ||||
|       return false; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    removeRemainingImport() { | ||||
|     while (!this.tokens.matches1(_types.TokenType.string)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    processIdentifier() { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     if (token.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     if (token.identifierRole === _tokenizer.IdentifierRole.ObjectShorthand) { | ||||
|       return this.processObjectShorthand(); | ||||
|     } | ||||
| 
 | ||||
|     if (token.identifierRole !== _tokenizer.IdentifierRole.Access) { | ||||
|       return false; | ||||
|     } | ||||
|     const replacement = this.importProcessor.getIdentifierReplacement( | ||||
|       this.tokens.identifierNameForToken(token), | ||||
|     ); | ||||
|     if (!replacement) { | ||||
|       return false; | ||||
|     } | ||||
|     // Tolerate any number of closing parens while looking for an opening paren | ||||
|     // that indicates a function call. | ||||
|     let possibleOpenParenIndex = this.tokens.currentIndex() + 1; | ||||
|     while ( | ||||
|       possibleOpenParenIndex < this.tokens.tokens.length && | ||||
|       this.tokens.tokens[possibleOpenParenIndex].type === _types.TokenType.parenR | ||||
|     ) { | ||||
|       possibleOpenParenIndex++; | ||||
|     } | ||||
|     // Avoid treating imported functions as methods of their `exports` object | ||||
|     // by using `(0, f)` when the identifier is in a paren expression. Else | ||||
|     // use `Function.prototype.call` when the identifier is a guaranteed | ||||
|     // function call. When using `call`, pass undefined as the context. | ||||
|     if (this.tokens.tokens[possibleOpenParenIndex].type === _types.TokenType.parenL) { | ||||
|       if ( | ||||
|         this.tokens.tokenAtRelativeIndex(1).type === _types.TokenType.parenL && | ||||
|         this.tokens.tokenAtRelativeIndex(-1).type !== _types.TokenType._new | ||||
|       ) { | ||||
|         this.tokens.replaceToken(`${replacement}.call(void 0, `); | ||||
|         // Remove the old paren. | ||||
|         this.tokens.removeToken(); | ||||
|         // Balance out the new paren. | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|       } else { | ||||
|         // See here: http://2ality.com/2015/12/references.html | ||||
|         this.tokens.replaceToken(`(0, ${replacement})`); | ||||
|       } | ||||
|     } else { | ||||
|       this.tokens.replaceToken(replacement); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
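|   // Illustrative sketch (not from the sucrase source): assuming a module-level | ||||
|   // `import {f} from './x';` whose interop name is `_x`, a direct call `f(1)` | ||||
|   // is rewritten to `_x.f.call(void 0, 1)` so the callee does not receive the | ||||
|   // module object as `this`, while a parenthesized call like `(f)(1)` becomes | ||||
|   // `((0, _x.f))(1)`. | ||||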
| 
 | ||||
|   processObjectShorthand() { | ||||
|     const identifier = this.tokens.identifierName(); | ||||
|     const replacement = this.importProcessor.getIdentifierReplacement(identifier); | ||||
|     if (!replacement) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.replaceToken(`${identifier}: ${replacement}`); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   processExport() { | ||||
|     if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._enum) || | ||||
|       this.tokens.matches3(_types.TokenType._export, _types.TokenType._const, _types.TokenType._enum) | ||||
|     ) { | ||||
|       this.hadNamedExport = true; | ||||
|       // Let the TypeScript transform handle it. | ||||
|       return false; | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType._default)) { | ||||
|       if (this.tokens.matches3(_types.TokenType._export, _types.TokenType._default, _types.TokenType._enum)) { | ||||
|         this.hadDefaultExport = true; | ||||
|         // Flow export default enums need some special handling, so handle them | ||||
|         // in that transform rather than this one. | ||||
|         return false; | ||||
|       } | ||||
|       this.processExportDefault(); | ||||
|       return true; | ||||
|     } else if (this.tokens.matches2(_types.TokenType._export, _types.TokenType.braceL)) { | ||||
|       this.processExportBindings(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType.name) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, _keywords.ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export type {a}; | ||||
|       // export type {a as b}; | ||||
|       // export type {a} from './b'; | ||||
|       // export type * from './b'; | ||||
|       // export type * as ns from './b'; | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|         while (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         // * | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(_types.TokenType._as)) { | ||||
|           // as | ||||
|           this.tokens.removeToken(); | ||||
|           // ns | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|       // Remove type re-export `... } from './T'` | ||||
|       if ( | ||||
|         this.tokens.matchesContextual(_keywords.ContextualKeyword._from) && | ||||
|         this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, _types.TokenType.string) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     this.hadNamedExport = true; | ||||
|     if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._var) || | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._let) || | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._const) | ||||
|     ) { | ||||
|       this.processExportVar(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._function) || | ||||
|       // export async function | ||||
|       this.tokens.matches3(_types.TokenType._export, _types.TokenType.name, _types.TokenType._function) | ||||
|     ) { | ||||
|       this.processExportFunction(); | ||||
|       return true; | ||||
|     } else if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType._class) || | ||||
|       this.tokens.matches3(_types.TokenType._export, _types.TokenType._abstract, _types.TokenType._class) || | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType.at) | ||||
|     ) { | ||||
|       this.processExportClass(); | ||||
|       return true; | ||||
|     } else if (this.tokens.matches2(_types.TokenType._export, _types.TokenType.star)) { | ||||
|       this.processExportStar(); | ||||
|       return true; | ||||
|     } else { | ||||
|       throw new Error("Unrecognized export syntax."); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    processAssignment() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index - 1]; | ||||
|     // If the LHS is a type identifier, this must be a declaration like `let a: b = c;`, | ||||
|     // with `b` as the identifier, so nothing needs to be done in that case. | ||||
|     if (identifierToken.isType || identifierToken.type !== _types.TokenType.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && this.tokens.matches1AtIndex(index - 2, _types.TokenType.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && [_types.TokenType._var, _types.TokenType._let, _types.TokenType._const].includes(this.tokens.tokens[index - 2].type)) { | ||||
|       // Declarations don't need an extra assignment. This doesn't avoid the | ||||
|       // assignment for comma-separated declarations, but it's still correct | ||||
|       // since the assignment is just redundant. | ||||
|       return false; | ||||
|     } | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding( | ||||
|       this.tokens.identifierNameForToken(identifierToken), | ||||
|     ); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.copyToken(); | ||||
|     this.tokens.appendCode(` ${assignmentSnippet} =`); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process something like `a += 3`, where `a` might be an exported value. | ||||
|    */ | ||||
|    processComplexAssignment() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index - 1]; | ||||
|     if (identifierToken.type !== _types.TokenType.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 2 && this.tokens.matches1AtIndex(index - 2, _types.TokenType.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding( | ||||
|       this.tokens.identifierNameForToken(identifierToken), | ||||
|     ); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.appendCode(` = ${assignmentSnippet}`); | ||||
|     this.tokens.copyToken(); | ||||
|     return true; | ||||
|   } | ||||
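|   // Illustrative sketch (not from the sucrase source): for an exported | ||||
|   // `let a`, the statement `a += 3` is rewritten to `a = exports.a += 3`, | ||||
|   // keeping the local binding and `exports.a` in sync. | ||||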
| 
 | ||||
|   /** | ||||
|    * Process something like `++a`, where `a` might be an exported value. | ||||
|    */ | ||||
|    processPreIncDec() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index + 1]; | ||||
|     if (identifierToken.type !== _types.TokenType.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     // Ignore things like ++a.b and ++a[b] and ++a().b. | ||||
|     if ( | ||||
|       index + 2 < this.tokens.tokens.length && | ||||
|       (this.tokens.matches1AtIndex(index + 2, _types.TokenType.dot) || | ||||
|         this.tokens.matches1AtIndex(index + 2, _types.TokenType.bracketL) || | ||||
|         this.tokens.matches1AtIndex(index + 2, _types.TokenType.parenL)) | ||||
|     ) { | ||||
|       return false; | ||||
|     } | ||||
|     const identifierName = this.tokens.identifierNameForToken(identifierToken); | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding(identifierName); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.appendCode(`${assignmentSnippet} = `); | ||||
|     this.tokens.copyToken(); | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process something like `a++`, where `a` might be an exported value. | ||||
|    * This starts at the `a`, not at the `++`. | ||||
|    */ | ||||
|    processPostIncDec() { | ||||
|     const index = this.tokens.currentIndex(); | ||||
|     const identifierToken = this.tokens.tokens[index]; | ||||
|     const operatorToken = this.tokens.tokens[index + 1]; | ||||
|     if (identifierToken.type !== _types.TokenType.name) { | ||||
|       return false; | ||||
|     } | ||||
|     if (identifierToken.shadowsGlobal) { | ||||
|       return false; | ||||
|     } | ||||
|     if (index >= 1 && this.tokens.matches1AtIndex(index - 1, _types.TokenType.dot)) { | ||||
|       return false; | ||||
|     } | ||||
|     const identifierName = this.tokens.identifierNameForToken(identifierToken); | ||||
|     const assignmentSnippet = this.importProcessor.resolveExportBinding(identifierName); | ||||
|     if (!assignmentSnippet) { | ||||
|       return false; | ||||
|     } | ||||
|     const operatorCode = this.tokens.rawCodeForToken(operatorToken); | ||||
|     // We might also replace the identifier with something like exports.x, so | ||||
|     // do that replacement here as well. | ||||
|     const base = this.importProcessor.getIdentifierReplacement(identifierName) || identifierName; | ||||
|     if (operatorCode === "++") { | ||||
|       this.tokens.replaceToken(`(${base} = ${assignmentSnippet} = ${base} + 1, ${base} - 1)`); | ||||
|     } else if (operatorCode === "--") { | ||||
|       this.tokens.replaceToken(`(${base} = ${assignmentSnippet} = ${base} - 1, ${base} + 1)`); | ||||
|     } else { | ||||
|       throw new Error(`Unexpected operator: ${operatorCode}`); | ||||
|     } | ||||
|     this.tokens.removeToken(); | ||||
|     return true; | ||||
|   } | ||||
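|   // Illustrative sketch (not from the sucrase source): for an exported | ||||
|   // `let a`, `a++` is rewritten to `(a = exports.a = a + 1, a - 1)`, which | ||||
|   // updates both `a` and `exports.a` while still evaluating to the old value. | ||||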
| 
 | ||||
|    processExportDefault() { | ||||
|     let exportedRuntimeValue = true; | ||||
|     if ( | ||||
|       this.tokens.matches4(_types.TokenType._export, _types.TokenType._default, _types.TokenType._function, _types.TokenType.name) || | ||||
|       // export default async function | ||||
|       (this.tokens.matches5(_types.TokenType._export, _types.TokenType._default, _types.TokenType.name, _types.TokenType._function, _types.TokenType.name) && | ||||
|         this.tokens.matchesContextualAtIndex( | ||||
|           this.tokens.currentIndex() + 2, | ||||
|           _keywords.ContextualKeyword._async, | ||||
|         )) | ||||
|     ) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       // Named function export case: change it to a top-level function | ||||
|       // declaration followed by an exports statement. | ||||
|       const name = this.processNamedFunction(); | ||||
|       this.tokens.appendCode(` exports.default = ${name};`); | ||||
|     } else if ( | ||||
|       this.tokens.matches4(_types.TokenType._export, _types.TokenType._default, _types.TokenType._class, _types.TokenType.name) || | ||||
|       this.tokens.matches5(_types.TokenType._export, _types.TokenType._default, _types.TokenType._abstract, _types.TokenType._class, _types.TokenType.name) || | ||||
|       this.tokens.matches3(_types.TokenType._export, _types.TokenType._default, _types.TokenType.at) | ||||
|     ) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.copyDecorators(); | ||||
|       if (this.tokens.matches1(_types.TokenType._abstract)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       const name = this.rootTransformer.processNamedClass(); | ||||
|       this.tokens.appendCode(` exports.default = ${name};`); | ||||
|       // After this point, this is a plain "export default E" statement. | ||||
|     } else if ( | ||||
|       _shouldElideDefaultExport2.default.call(void 0,  | ||||
|         this.isTypeScriptTransformEnabled, | ||||
|         this.keepUnusedImports, | ||||
|         this.tokens, | ||||
|         this.declarationInfo, | ||||
|       ) | ||||
|     ) { | ||||
|       // If the exported value is just an identifier and should be elided by TypeScript | ||||
|       // rules, then remove it entirely. It will always have the form `export default e`, | ||||
|       // where `e` is an identifier. | ||||
|       exportedRuntimeValue = false; | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|     } else if (this.reactHotLoaderTransformer) { | ||||
|       // We need to assign E to a variable. Change "export default E" to | ||||
|       // "let _default; exports.default = _default = E" | ||||
|       const defaultVarName = this.nameManager.claimFreeName("_default"); | ||||
|       this.tokens.replaceToken(`let ${defaultVarName}; exports.`); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(` = ${defaultVarName} =`); | ||||
|       this.reactHotLoaderTransformer.setExtractedDefaultExportName(defaultVarName); | ||||
|     } else { | ||||
|       // Change "export default E" to "exports.default = E" | ||||
|       this.tokens.replaceToken("exports."); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(" ="); | ||||
|     } | ||||
|     if (exportedRuntimeValue) { | ||||
|       this.hadDefaultExport = true; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    copyDecorators() { | ||||
|     while (this.tokens.matches1(_types.TokenType.at)) { | ||||
|       this.tokens.copyToken(); | ||||
|       if (this.tokens.matches1(_types.TokenType.parenL)) { | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.parenL); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|       } else { | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.name); | ||||
|         while (this.tokens.matches1(_types.TokenType.dot)) { | ||||
|           this.tokens.copyExpectedToken(_types.TokenType.dot); | ||||
|           this.tokens.copyExpectedToken(_types.TokenType.name); | ||||
|         } | ||||
|         if (this.tokens.matches1(_types.TokenType.parenL)) { | ||||
|           this.tokens.copyExpectedToken(_types.TokenType.parenL); | ||||
|           this.rootTransformer.processBalancedCode(); | ||||
|           this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform a declaration like `export var`, `export let`, or `export const`. | ||||
|    */ | ||||
|    processExportVar() { | ||||
|     if (this.isSimpleExportVar()) { | ||||
|       this.processSimpleExportVar(); | ||||
|     } else { | ||||
|       this.processComplexExportVar(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Determine if the export is of the form: | ||||
|    * export var/let/const [varName] = [expr]; | ||||
|    * In other words, determine if function name inference might apply. | ||||
|    */ | ||||
|    isSimpleExportVar() { | ||||
|     let tokenIndex = this.tokens.currentIndex(); | ||||
|     // export | ||||
|     tokenIndex++; | ||||
|     // var/let/const | ||||
|     tokenIndex++; | ||||
|     if (!this.tokens.matches1AtIndex(tokenIndex, _types.TokenType.name)) { | ||||
|       return false; | ||||
|     } | ||||
|     tokenIndex++; | ||||
|     while (tokenIndex < this.tokens.tokens.length && this.tokens.tokens[tokenIndex].isType) { | ||||
|       tokenIndex++; | ||||
|     } | ||||
|     if (!this.tokens.matches1AtIndex(tokenIndex, _types.TokenType.eq)) { | ||||
|       return false; | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform an `export var` declaration initializing a single variable. | ||||
|    * | ||||
|    * For example, this: | ||||
|    * export const f = () => {}; | ||||
|    * becomes this: | ||||
|    * const f = () => {}; exports.f = f; | ||||
|    * | ||||
|    * The variable is unused (i.e. exports.f has the true value of the export). | ||||
|    * We need to produce an assignment of this form so that the function will | ||||
|    * have an inferred name of "f", which wouldn't happen in the more general | ||||
|    * case below. | ||||
|    */ | ||||
|    processSimpleExportVar() { | ||||
|     // export | ||||
|     this.tokens.removeInitialToken(); | ||||
|     // var/let/const | ||||
|     this.tokens.copyToken(); | ||||
|     const varName = this.tokens.identifierName(); | ||||
|     // x: number  ->  x | ||||
|     while (!this.tokens.matches1(_types.TokenType.eq)) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|     const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|     if (endIndex == null) { | ||||
|       throw new Error("Expected = token with an end index."); | ||||
|     } | ||||
|     while (this.tokens.currentIndex() < endIndex) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|     this.tokens.appendCode(`; exports.${varName} = ${varName}`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform normal declaration exports, including handling destructuring. | ||||
|    * For example, this: | ||||
|    * export const {x: [a = 2, b], c} = d; | ||||
|    * becomes this: | ||||
|    * ({x: [exports.a = 2, exports.b], c: exports.c} = d); | ||||
|    */ | ||||
|    processComplexExportVar() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.tokens.removeToken(); | ||||
|     const needsParens = this.tokens.matches1(_types.TokenType.braceL); | ||||
|     if (needsParens) { | ||||
|       this.tokens.appendCode("("); | ||||
|     } | ||||
| 
 | ||||
|     let depth = 0; | ||||
|     while (true) { | ||||
|       if ( | ||||
|         this.tokens.matches1(_types.TokenType.braceL) || | ||||
|         this.tokens.matches1(_types.TokenType.dollarBraceL) || | ||||
|         this.tokens.matches1(_types.TokenType.bracketL) | ||||
|       ) { | ||||
|         depth++; | ||||
|         this.tokens.copyToken(); | ||||
|       } else if (this.tokens.matches1(_types.TokenType.braceR) || this.tokens.matches1(_types.TokenType.bracketR)) { | ||||
|         depth--; | ||||
|         this.tokens.copyToken(); | ||||
|       } else if ( | ||||
|         depth === 0 && | ||||
|         !this.tokens.matches1(_types.TokenType.name) && | ||||
|         !this.tokens.currentToken().isType | ||||
|       ) { | ||||
|         break; | ||||
|       } else if (this.tokens.matches1(_types.TokenType.eq)) { | ||||
|         // Default values might have assignments in the RHS that we want to ignore, so skip past | ||||
|         // them. | ||||
|         const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|         if (endIndex == null) { | ||||
|           throw new Error("Expected = token with an end index."); | ||||
|         } | ||||
|         while (this.tokens.currentIndex() < endIndex) { | ||||
|           this.rootTransformer.processToken(); | ||||
|         } | ||||
|       } else { | ||||
|         const token = this.tokens.currentToken(); | ||||
|         if (_tokenizer.isDeclaration.call(void 0, token)) { | ||||
|           const name = this.tokens.identifierName(); | ||||
|           let replacement = this.importProcessor.getIdentifierReplacement(name); | ||||
|           if (replacement === null) { | ||||
|             throw new Error(`Expected a replacement for ${name} in \`export var\` syntax.`); | ||||
|           } | ||||
|           if (_tokenizer.isObjectShorthandDeclaration.call(void 0, token)) { | ||||
|             replacement = `${name}: ${replacement}`; | ||||
|           } | ||||
|           this.tokens.replaceToken(replacement); | ||||
|         } else { | ||||
|           this.rootTransformer.processToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (needsParens) { | ||||
|       // Seek to the end of the RHS. | ||||
|       const endIndex = this.tokens.currentToken().rhsEndIndex; | ||||
|       if (endIndex == null) { | ||||
|         throw new Error("Expected = token with an end index."); | ||||
|       } | ||||
|       while (this.tokens.currentIndex() < endIndex) { | ||||
|         this.rootTransformer.processToken(); | ||||
|       } | ||||
|       this.tokens.appendCode(")"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export function foo() {} | ||||
|    * into this: | ||||
|    * function foo() {} exports.foo = foo; | ||||
|    */ | ||||
|    processExportFunction() { | ||||
|     this.tokens.replaceToken(""); | ||||
|     const name = this.processNamedFunction(); | ||||
|     this.tokens.appendCode(` exports.${name} = ${name};`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Skip past a function with a name and return that name. | ||||
|    */ | ||||
|    processNamedFunction() { | ||||
|     if (this.tokens.matches1(_types.TokenType._function)) { | ||||
|       this.tokens.copyToken(); | ||||
|     } else if (this.tokens.matches2(_types.TokenType.name, _types.TokenType._function)) { | ||||
|       if (!this.tokens.matchesContextual(_keywords.ContextualKeyword._async)) { | ||||
|         throw new Error("Expected async keyword in function export."); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType.star)) { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|     if (!this.tokens.matches1(_types.TokenType.name)) { | ||||
|       throw new Error("Expected identifier for exported function name."); | ||||
|     } | ||||
|     const name = this.tokens.identifierName(); | ||||
|     this.tokens.copyToken(); | ||||
|     if (this.tokens.currentToken().isType) { | ||||
|       this.tokens.removeInitialToken(); | ||||
|       while (this.tokens.currentToken().isType) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.parenL); | ||||
|     this.rootTransformer.processBalancedCode(); | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|     this.rootTransformer.processPossibleTypeRange(); | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.braceL); | ||||
|     this.rootTransformer.processBalancedCode(); | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.braceR); | ||||
|     return name; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export class A {} | ||||
|    * into this: | ||||
|    * class A {} exports.A = A; | ||||
|    */ | ||||
|    processExportClass() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.copyDecorators(); | ||||
|     if (this.tokens.matches1(_types.TokenType._abstract)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const name = this.rootTransformer.processNamedClass(); | ||||
|     this.tokens.appendCode(` exports.${name} = ${name};`); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform this: | ||||
|    * export {a, b as c}; | ||||
|    * into this: | ||||
|    * exports.a = a; exports.c = b; | ||||
|    * | ||||
|    * OR | ||||
|    * | ||||
|    * Transform this: | ||||
|    * export {a, b as c} from './foo'; | ||||
|    * into the pre-generated Object.defineProperty code from the ImportProcessor. | ||||
|    * | ||||
|    * For the first case, if the TypeScript transform is enabled, we need to skip | ||||
|    * exports that are only defined as types. | ||||
|    */ | ||||
|    processExportBindings() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     this.tokens.removeToken(); | ||||
| 
 | ||||
|     const isReExport = _isExportFrom2.default.call(void 0, this.tokens); | ||||
| 
 | ||||
|     const exportStatements = []; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
| 
 | ||||
|       const specifierInfo = _getImportExportSpecifierInfo2.default.call(void 0, this.tokens); | ||||
| 
 | ||||
|       while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
| 
 | ||||
|       const shouldRemoveExport = | ||||
|         specifierInfo.isType || | ||||
|         (!isReExport && this.shouldElideExportedIdentifier(specifierInfo.leftName)); | ||||
|       if (!shouldRemoveExport) { | ||||
|         const exportedName = specifierInfo.rightName; | ||||
|         if (exportedName === "default") { | ||||
|           this.hadDefaultExport = true; | ||||
|         } else { | ||||
|           this.hadNamedExport = true; | ||||
|         } | ||||
|         const localName = specifierInfo.leftName; | ||||
|         const newLocalName = this.importProcessor.getIdentifierReplacement(localName); | ||||
|         exportStatements.push(`exports.${exportedName} = ${newLocalName || localName};`); | ||||
|       } | ||||
| 
 | ||||
|       if (this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
|       if (this.tokens.matches2(_types.TokenType.comma, _types.TokenType.braceR)) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } else if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         throw new Error(`Unexpected token: ${JSON.stringify(this.tokens.currentToken())}`); | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matchesContextual(_keywords.ContextualKeyword._from)) { | ||||
|       // This is an export...from, so throw away the normal named export code | ||||
|       // and use the Object.defineProperty code from ImportProcessor. | ||||
|       this.tokens.removeToken(); | ||||
|       const path = this.tokens.stringValue(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|       _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|     } else { | ||||
|       // This is a normal named export, so use that. | ||||
|       this.tokens.appendCode(exportStatements.join(" ")); | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    processExportStar() { | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(_types.TokenType.string)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const path = this.tokens.stringValue(); | ||||
|     this.tokens.replaceTokenTrimmingLeftWhitespace(this.importProcessor.claimImportCode(path)); | ||||
|     _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|     if (this.tokens.matches1(_types.TokenType.semi)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    shouldElideExportedIdentifier(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       !this.declarationInfo.valueDeclarations.has(name) | ||||
|     ); | ||||
|   } | ||||
| } exports.default = CJSImportTransformer; | ||||
							
								
								
									
										415
									
								
								node_modules/sucrase/dist/transformers/ESMImportTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,415 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } | ||||
| 
 | ||||
| 
 | ||||
| var _keywords = require('../parser/tokenizer/keywords'); | ||||
| var _types = require('../parser/tokenizer/types'); | ||||
| 
 | ||||
| var _elideImportEquals = require('../util/elideImportEquals'); var _elideImportEquals2 = _interopRequireDefault(_elideImportEquals); | ||||
| 
 | ||||
| 
 | ||||
| 
 | ||||
| var _getDeclarationInfo = require('../util/getDeclarationInfo'); var _getDeclarationInfo2 = _interopRequireDefault(_getDeclarationInfo); | ||||
| var _getImportExportSpecifierInfo = require('../util/getImportExportSpecifierInfo'); var _getImportExportSpecifierInfo2 = _interopRequireDefault(_getImportExportSpecifierInfo); | ||||
| var _getNonTypeIdentifiers = require('../util/getNonTypeIdentifiers'); | ||||
| var _isExportFrom = require('../util/isExportFrom'); var _isExportFrom2 = _interopRequireDefault(_isExportFrom); | ||||
| var _removeMaybeImportAttributes = require('../util/removeMaybeImportAttributes'); | ||||
| var _shouldElideDefaultExport = require('../util/shouldElideDefaultExport'); var _shouldElideDefaultExport2 = _interopRequireDefault(_shouldElideDefaultExport); | ||||
| 
 | ||||
| var _Transformer = require('./Transformer'); var _Transformer2 = _interopRequireDefault(_Transformer); | ||||
| 
 | ||||
| /** | ||||
|  * Class for editing import statements when we are keeping the code as ESM. We still need to remove | ||||
|  * type-only imports in TypeScript and Flow. | ||||
|  */ | ||||
|  class ESMImportTransformer extends _Transformer2.default { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   constructor( | ||||
|      tokens, | ||||
|      nameManager, | ||||
|      helperManager, | ||||
|      reactHotLoaderTransformer, | ||||
|      isTypeScriptTransformEnabled, | ||||
|      isFlowTransformEnabled, | ||||
|      keepUnusedImports, | ||||
|     options, | ||||
|   ) { | ||||
|     super();this.tokens = tokens;this.nameManager = nameManager;this.helperManager = helperManager;this.reactHotLoaderTransformer = reactHotLoaderTransformer;this.isTypeScriptTransformEnabled = isTypeScriptTransformEnabled;this.isFlowTransformEnabled = isFlowTransformEnabled;this.keepUnusedImports = keepUnusedImports;; | ||||
|     this.nonTypeIdentifiers = | ||||
|       isTypeScriptTransformEnabled && !keepUnusedImports | ||||
|         ? _getNonTypeIdentifiers.getNonTypeIdentifiers.call(void 0, tokens, options) | ||||
|         : new Set(); | ||||
|     this.declarationInfo = | ||||
|       isTypeScriptTransformEnabled && !keepUnusedImports | ||||
|         ? _getDeclarationInfo2.default.call(void 0, tokens) | ||||
|         : _getDeclarationInfo.EMPTY_DECLARATION_INFO; | ||||
|     this.injectCreateRequireForImportRequire = Boolean(options.injectCreateRequireForImportRequire); | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     // TypeScript `import foo = require('foo');` should always just be translated to plain require.
 | ||||
|     if (this.tokens.matches3(_types.TokenType._import, _types.TokenType.name, _types.TokenType.eq)) { | ||||
|       return this.processImportEquals(); | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches4(_types.TokenType._import, _types.TokenType.name, _types.TokenType.name, _types.TokenType.eq) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, _keywords.ContextualKeyword._type) | ||||
|     ) { | ||||
|       // import type T = require('T')
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       // This construct is always exactly 8 tokens long, so remove the 7 remaining tokens.
 | ||||
|       for (let i = 0; i < 7; i++) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType.eq)) { | ||||
|       this.tokens.replaceToken("module.exports"); | ||||
|       return true; | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches5(_types.TokenType._export, _types.TokenType._import, _types.TokenType.name, _types.TokenType.name, _types.TokenType.eq) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 2, _keywords.ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export import type T = require('T')
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       // This construct is always exactly 9 tokens long, so remove the 8 remaining tokens.
 | ||||
|       for (let i = 0; i < 8; i++) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType._import)) { | ||||
|       return this.processImport(); | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType._default)) { | ||||
|       return this.processExportDefault(); | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType.braceL)) { | ||||
|       return this.processNamedExports(); | ||||
|     } | ||||
|     if ( | ||||
|       this.tokens.matches2(_types.TokenType._export, _types.TokenType.name) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, _keywords.ContextualKeyword._type) | ||||
|     ) { | ||||
|       // export type {a};
 | ||||
|       // export type {a as b};
 | ||||
|       // export type {a} from './b';
 | ||||
|       // export type * from './b';
 | ||||
|       // export type * as ns from './b';
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|         while (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         // *
 | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(_types.TokenType._as)) { | ||||
|           // as
 | ||||
|           this.tokens.removeToken(); | ||||
|           // ns
 | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|       // Remove type re-export `... } from './T'`
 | ||||
|       if ( | ||||
|         this.tokens.matchesContextual(_keywords.ContextualKeyword._from) && | ||||
|         this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, _types.TokenType.string) | ||||
|       ) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|       } | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|    processImportEquals() { | ||||
|     const importName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|     if (this.shouldAutomaticallyElideImportedName(importName)) { | ||||
|       // If this name is only used as a type, elide the whole import.
 | ||||
|       _elideImportEquals2.default.call(void 0, this.tokens); | ||||
|     } else if (this.injectCreateRequireForImportRequire) { | ||||
|       // We're using require in an environment (Node ESM) that doesn't provide
 | ||||
|       // it as a global, so generate a helper to import it.
 | ||||
|       // import -> const
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|       // Foo
 | ||||
|       this.tokens.copyToken(); | ||||
|       // =
 | ||||
|       this.tokens.copyToken(); | ||||
|       // require
 | ||||
|       this.tokens.replaceToken(this.helperManager.getHelperName("require")); | ||||
|     } else { | ||||
|       // Otherwise, just switch `import` to `const`.
 | ||||
|       this.tokens.replaceToken("const"); | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|    processImport() { | ||||
|     if (this.tokens.matches2(_types.TokenType._import, _types.TokenType.parenL)) { | ||||
|       // Dynamic imports don't need to be transformed.
 | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     const snapshot = this.tokens.snapshot(); | ||||
|     const allImportsRemoved = this.removeImportTypeBindings(); | ||||
|     if (allImportsRemoved) { | ||||
|       this.tokens.restoreToSnapshot(snapshot); | ||||
|       while (!this.tokens.matches1(_types.TokenType.string)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|       this.tokens.removeToken(); | ||||
|       _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|       if (this.tokens.matches1(_types.TokenType.semi)) { | ||||
|         this.tokens.removeToken(); | ||||
|       } | ||||
|     } | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Remove type bindings from this import, leaving the rest of the import intact. | ||||
|    * | ||||
|    * Return true if this import was ONLY types, and thus is eligible for removal. This will bail out | ||||
|    * of the replacement operation, so we can return early here. | ||||
|    */ | ||||
|    removeImportTypeBindings() { | ||||
|     this.tokens.copyExpectedToken(_types.TokenType._import); | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(_keywords.ContextualKeyword._type) && | ||||
|       !this.tokens.matches1AtIndex(this.tokens.currentIndex() + 1, _types.TokenType.comma) && | ||||
|       !this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 1, _keywords.ContextualKeyword._from) | ||||
|     ) { | ||||
|       // This is an "import type" statement, so exit early.
 | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.string)) { | ||||
|       // This is a bare import, so we should proceed with the import.
 | ||||
|       this.tokens.copyToken(); | ||||
|       return false; | ||||
|     } | ||||
| 
 | ||||
|     // Skip the "module" token in import reflection.
 | ||||
|     if ( | ||||
|       this.tokens.matchesContextual(_keywords.ContextualKeyword._module) && | ||||
|       this.tokens.matchesContextualAtIndex(this.tokens.currentIndex() + 2, _keywords.ContextualKeyword._from) | ||||
|     ) { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
| 
 | ||||
|     let foundNonTypeImport = false; | ||||
|     let foundAnyNamedImport = false; | ||||
|     let needsComma = false; | ||||
| 
 | ||||
|     // Handle default import.
 | ||||
|     if (this.tokens.matches1(_types.TokenType.name)) { | ||||
|       if (this.shouldAutomaticallyElideImportedName(this.tokens.identifierName())) { | ||||
|         this.tokens.removeToken(); | ||||
|         if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } else { | ||||
|         foundNonTypeImport = true; | ||||
|         this.tokens.copyToken(); | ||||
|         if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|           // We're in a statement like:
 | ||||
|           // import A, * as B from './A';
 | ||||
|           // or
 | ||||
|           // import A, {foo} from './A';
 | ||||
|           // where the `A` is being kept. The comma should be removed if and only
 | ||||
|           // if the next part of the import statement is elided, but that's hard
 | ||||
|           // to determine at this point in the code. Instead, always remove it
 | ||||
|           // and set a flag to add it back if necessary.
 | ||||
|           needsComma = true; | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.star)) { | ||||
|       if (this.shouldAutomaticallyElideImportedName(this.tokens.identifierNameAtRelativeIndex(2))) { | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|         this.tokens.removeToken(); | ||||
|       } else { | ||||
|         if (needsComma) { | ||||
|           this.tokens.appendCode(","); | ||||
|         } | ||||
|         foundNonTypeImport = true; | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.star); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.name); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.name); | ||||
|       } | ||||
|     } else if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|       if (needsComma) { | ||||
|         this.tokens.appendCode(","); | ||||
|       } | ||||
|       this.tokens.copyToken(); | ||||
|       while (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|         foundAnyNamedImport = true; | ||||
|         const specifierInfo = _getImportExportSpecifierInfo2.default.call(void 0, this.tokens); | ||||
|         if ( | ||||
|           specifierInfo.isType || | ||||
|           this.shouldAutomaticallyElideImportedName(specifierInfo.rightName) | ||||
|         ) { | ||||
|           while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|             this.tokens.removeToken(); | ||||
|           } | ||||
|           if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|             this.tokens.removeToken(); | ||||
|           } | ||||
|         } else { | ||||
|           foundNonTypeImport = true; | ||||
|           while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|             this.tokens.copyToken(); | ||||
|           } | ||||
|           if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|             this.tokens.copyToken(); | ||||
|           } | ||||
|         } | ||||
|       } | ||||
|       this.tokens.copyExpectedToken(_types.TokenType.braceR); | ||||
|     } | ||||
| 
 | ||||
|     if (this.keepUnusedImports) { | ||||
|       return false; | ||||
|     } | ||||
|     if (this.isTypeScriptTransformEnabled) { | ||||
|       return !foundNonTypeImport; | ||||
|     } else if (this.isFlowTransformEnabled) { | ||||
|       // In Flow, unlike TS, `import {} from 'foo';` preserves the import.
 | ||||
|       return foundAnyNamedImport && !foundNonTypeImport; | ||||
|     } else { | ||||
|       return false; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|    shouldAutomaticallyElideImportedName(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       !this.nonTypeIdentifiers.has(name) | ||||
|     ); | ||||
|   } | ||||
| 
 | ||||
|    processExportDefault() { | ||||
|     if ( | ||||
|       _shouldElideDefaultExport2.default.call(void 0,  | ||||
|         this.isTypeScriptTransformEnabled, | ||||
|         this.keepUnusedImports, | ||||
|         this.tokens, | ||||
|         this.declarationInfo, | ||||
|       ) | ||||
|     ) { | ||||
|       // If the exported value is just an identifier and should be elided by TypeScript
 | ||||
|       // rules, then remove it entirely. It will always have the form `export default e`,
 | ||||
|       // where `e` is an identifier.
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       return true; | ||||
|     } | ||||
| 
 | ||||
|     const alreadyHasName = | ||||
|       this.tokens.matches4(_types.TokenType._export, _types.TokenType._default, _types.TokenType._function, _types.TokenType.name) || | ||||
|       // export default async function
 | ||||
|       (this.tokens.matches5(_types.TokenType._export, _types.TokenType._default, _types.TokenType.name, _types.TokenType._function, _types.TokenType.name) && | ||||
|         this.tokens.matchesContextualAtIndex( | ||||
|           this.tokens.currentIndex() + 2, | ||||
|           _keywords.ContextualKeyword._async, | ||||
|         )) || | ||||
|       this.tokens.matches4(_types.TokenType._export, _types.TokenType._default, _types.TokenType._class, _types.TokenType.name) || | ||||
|       this.tokens.matches5(_types.TokenType._export, _types.TokenType._default, _types.TokenType._abstract, _types.TokenType._class, _types.TokenType.name); | ||||
| 
 | ||||
|     if (!alreadyHasName && this.reactHotLoaderTransformer) { | ||||
|       // This is a plain "export default E" statement and we need to assign E to a variable.
 | ||||
|       // Change "export default E" to "let _default; export default _default = E"
 | ||||
|       const defaultVarName = this.nameManager.claimFreeName("_default"); | ||||
|       this.tokens.replaceToken(`let ${defaultVarName}; export`); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(` ${defaultVarName} =`); | ||||
|       this.reactHotLoaderTransformer.setExtractedDefaultExportName(defaultVarName); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a statement with one of these forms: | ||||
|    * export {a, type b}; | ||||
|    * export {c, type d} from 'foo'; | ||||
|    * | ||||
|    * In both cases, any explicit type exports should be removed. In the first | ||||
|    * case, we also need to handle implicit export elision for names declared as | ||||
|    * types. In the second case, we must NOT do implicit named export elision, | ||||
|    * but we must remove the runtime import if all exports are type exports. | ||||
|    */ | ||||
|    processNamedExports() { | ||||
|     if (!this.isTypeScriptTransformEnabled) { | ||||
|       return false; | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(_types.TokenType._export); | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.braceL); | ||||
| 
 | ||||
|     const isReExport = _isExportFrom2.default.call(void 0, this.tokens); | ||||
|     let foundNonTypeExport = false; | ||||
|     while (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|       const specifierInfo = _getImportExportSpecifierInfo2.default.call(void 0, this.tokens); | ||||
|       if ( | ||||
|         specifierInfo.isType || | ||||
|         (!isReExport && this.shouldElideExportedName(specifierInfo.leftName)) | ||||
|       ) { | ||||
|         // Type export, so remove all tokens, including any comma.
 | ||||
|         while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|         if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|           this.tokens.removeToken(); | ||||
|         } | ||||
|       } else { | ||||
|         // Non-type export, so copy all tokens, including any comma.
 | ||||
|         foundNonTypeExport = true; | ||||
|         while (this.tokens.currentIndex() < specifierInfo.endIndex) { | ||||
|           this.tokens.copyToken(); | ||||
|         } | ||||
|         if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|           this.tokens.copyToken(); | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.braceR); | ||||
| 
 | ||||
|     if (!this.keepUnusedImports && isReExport && !foundNonTypeExport) { | ||||
|       // This is a type-only re-export, so skip evaluating the other module. Technically this
 | ||||
|       // leaves the statement as `export {}`, but that's ok since that's a no-op.
 | ||||
|       this.tokens.removeToken(); | ||||
|       this.tokens.removeToken(); | ||||
|       _removeMaybeImportAttributes.removeMaybeImportAttributes.call(void 0, this.tokens); | ||||
|     } | ||||
| 
 | ||||
|     return true; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * ESM elides all imports with the rule that we only elide if we see that it's | ||||
|    * a type and never see it as a value. This is in contrast to CJS, which | ||||
|    * elides imports that are completely unknown. | ||||
|    */ | ||||
|    shouldElideExportedName(name) { | ||||
|     return ( | ||||
|       this.isTypeScriptTransformEnabled && | ||||
|       !this.keepUnusedImports && | ||||
|       this.declarationInfo.typeDeclarations.has(name) && | ||||
|       !this.declarationInfo.valueDeclarations.has(name) | ||||
|     ); | ||||
|   } | ||||
| } exports.default = ESMImportTransformer; | ||||
							
								
								
									
										182
									
								
								node_modules/sucrase/dist/transformers/FlowTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,182 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; }var _keywords = require('../parser/tokenizer/keywords'); | ||||
| var _types = require('../parser/tokenizer/types'); | ||||
| 
 | ||||
| 
 | ||||
| var _Transformer = require('./Transformer'); var _Transformer2 = _interopRequireDefault(_Transformer); | ||||
| 
 | ||||
|  class FlowTransformer extends _Transformer2.default { | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      isImportsTransformEnabled, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.isImportsTransformEnabled = isImportsTransformEnabled;; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if ( | ||||
|       this.rootTransformer.processPossibleArrowParamEnd() || | ||||
|       this.rootTransformer.processPossibleAsyncArrowWithTypeParams() || | ||||
|       this.rootTransformer.processPossibleTypeRange() | ||||
|     ) { | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches1(_types.TokenType._enum)) { | ||||
|       this.processEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches2(_types.TokenType._export, _types.TokenType._enum)) { | ||||
|       this.processNamedExportEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     if (this.tokens.matches3(_types.TokenType._export, _types.TokenType._default, _types.TokenType._enum)) { | ||||
|       this.processDefaultExportEnum(); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a declaration like: | ||||
|    * export enum E ... | ||||
|    * | ||||
|  * With the imports transform, this becomes: | ||||
|    * const E = [[enum]]; exports.E = E; | ||||
|    * | ||||
|    * otherwise, it becomes: | ||||
|    * export const E = [[enum]]; | ||||
|    */ | ||||
|   processNamedExportEnum() { | ||||
|     if (this.isImportsTransformEnabled) { | ||||
|       // export
 | ||||
|       this.tokens.removeInitialToken(); | ||||
|       const enumName = this.tokens.identifierNameAtRelativeIndex(1); | ||||
|       this.processEnum(); | ||||
|       this.tokens.appendCode(` exports.${enumName} = ${enumName};`); | ||||
|     } else { | ||||
|       this.tokens.copyToken(); | ||||
|       this.processEnum(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Handle a declaration like: | ||||
|    * export default enum E | ||||
|    * | ||||
|    * With the imports transform, this becomes: | ||||
|    * const E = [[enum]]; exports.default = E; | ||||
|    * | ||||
|    * otherwise, it becomes: | ||||
|    * const E = [[enum]]; export default E; | ||||
|    */ | ||||
|   processDefaultExportEnum() { | ||||
|     // export
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     // default
 | ||||
|     this.tokens.removeToken(); | ||||
|     const enumName = this.tokens.identifierNameAtRelativeIndex(1); | ||||
|     this.processEnum(); | ||||
|     if (this.isImportsTransformEnabled) { | ||||
|       this.tokens.appendCode(` exports.default = ${enumName};`); | ||||
|     } else { | ||||
|       this.tokens.appendCode(` export default ${enumName};`); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transpile flow enums to invoke the "flow-enums-runtime" library. | ||||
|    * | ||||
|    * Currently, the transpiled code always uses `require("flow-enums-runtime")`, | ||||
|    * but if future flexibility is needed, we could expose a config option for | ||||
|    * this string (similar to configurable JSX). Even when targeting ESM, the | ||||
|    * default behavior of babel-plugin-transform-flow-enums is to use require | ||||
|    * rather than injecting an import. | ||||
|    * | ||||
|    * Flow enums are quite a bit simpler than TS enums and have some convenient | ||||
|    * constraints: | ||||
|    * - Element initializers must be either always present or always absent. That | ||||
|    *   means that we can use fixed lookahead on the first element (if any) and | ||||
|    *   assume that all elements are like that. | ||||
|    * - The right-hand side of an element initializer must be a literal value, | ||||
|    *   not a complex expression and not referencing other elements. That means | ||||
|    *   we can simply copy a single token. | ||||
|    * | ||||
|    * Enums can be broken up into three basic cases: | ||||
|    * | ||||
|    * Mirrored enums: | ||||
|    * enum E {A, B} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime").Mirrored(["A", "B"]); | ||||
|    * | ||||
|    * Initializer enums: | ||||
|    * enum E {A = 1, B = 2} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime")({A: 1, B: 2}); | ||||
|    * | ||||
|    * Symbol enums: | ||||
|    * enum E of symbol {A, B} | ||||
|    *   -> | ||||
|    * const E = require("flow-enums-runtime")({A: Symbol("A"), B: Symbol("B")}); | ||||
|    * | ||||
|    * We can statically detect which of the three cases this is by looking at the | ||||
|    * "of" declaration (if any) and seeing if the first element has an initializer. | ||||
|    * Since the other transform details are so similar between the three cases, we | ||||
|    * use a single implementation and vary the transform within processEnumElement | ||||
|    * based on case. | ||||
|    */ | ||||
|   processEnum() { | ||||
|     // enum E -> const E
 | ||||
|     this.tokens.replaceToken("const"); | ||||
|     this.tokens.copyExpectedToken(_types.TokenType.name); | ||||
| 
 | ||||
|     let isSymbolEnum = false; | ||||
|     if (this.tokens.matchesContextual(_keywords.ContextualKeyword._of)) { | ||||
|       this.tokens.removeToken(); | ||||
|       isSymbolEnum = this.tokens.matchesContextual(_keywords.ContextualKeyword._symbol); | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     const hasInitializers = this.tokens.matches3(_types.TokenType.braceL, _types.TokenType.name, _types.TokenType.eq); | ||||
|     this.tokens.appendCode(' = require("flow-enums-runtime")'); | ||||
| 
 | ||||
|     const isMirrored = !isSymbolEnum && !hasInitializers; | ||||
|     this.tokens.replaceTokenTrimmingLeftWhitespace(isMirrored ? ".Mirrored([" : "({"); | ||||
| 
 | ||||
|     while (!this.tokens.matches1(_types.TokenType.braceR)) { | ||||
|       // ... is allowed at the end and has no runtime behavior.
 | ||||
|       if (this.tokens.matches1(_types.TokenType.ellipsis)) { | ||||
|         this.tokens.removeToken(); | ||||
|         break; | ||||
|       } | ||||
|       this.processEnumElement(isSymbolEnum, hasInitializers); | ||||
|       if (this.tokens.matches1(_types.TokenType.comma)) { | ||||
|         this.tokens.copyToken(); | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     this.tokens.replaceToken(isMirrored ? "]);" : "});"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process an individual enum element, producing either an array element or an | ||||
|    * object element based on what type of enum this is. | ||||
|    */ | ||||
|   processEnumElement(isSymbolEnum, hasInitializers) { | ||||
|     if (isSymbolEnum) { | ||||
|       // Symbol enums never have initializers and are expanded to object elements.
 | ||||
|       // A, -> A: Symbol("A"),
 | ||||
|       const elementName = this.tokens.identifierName(); | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.appendCode(`: Symbol("${elementName}")`); | ||||
|     } else if (hasInitializers) { | ||||
|       // Initializers are expanded to object elements.
 | ||||
|       // A = 1, -> A: 1,
 | ||||
|       this.tokens.copyToken(); | ||||
|       this.tokens.replaceTokenTrimmingLeftWhitespace(":"); | ||||
|       this.tokens.copyToken(); | ||||
|     } else { | ||||
|       // Enum elements without initializers become string literal array elements.
 | ||||
|       // A, -> "A",
 | ||||
|       this.tokens.replaceToken(`"${this.tokens.identifierName()}"`); | ||||
|     } | ||||
|   } | ||||
| } exports.default = FlowTransformer; | ||||
							
								
								
									
										733
									
								
								node_modules/sucrase/dist/transformers/JSXTransformer.js
									
										
									
										generated
									
									
										vendored
									
									
										Normal file
									
								
							
							
						
						
									
									
								
							|  | @ -0,0 +1,733 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } | ||||
| 
 | ||||
| 
 | ||||
| var _xhtml = require('../parser/plugins/jsx/xhtml'); var _xhtml2 = _interopRequireDefault(_xhtml); | ||||
| var _tokenizer = require('../parser/tokenizer'); | ||||
| var _types = require('../parser/tokenizer/types'); | ||||
| var _charcodes = require('../parser/util/charcodes'); | ||||
| 
 | ||||
| var _getJSXPragmaInfo = require('../util/getJSXPragmaInfo'); var _getJSXPragmaInfo2 = _interopRequireDefault(_getJSXPragmaInfo); | ||||
| 
 | ||||
| var _Transformer = require('./Transformer'); var _Transformer2 = _interopRequireDefault(_Transformer); | ||||
| 
 | ||||
|  class JSXTransformer extends _Transformer2.default { | ||||
|    | ||||
|    | ||||
|    | ||||
| 
 | ||||
|   // State for calculating the line number of each JSX tag in development.
 | ||||
|   __init() {this.lastLineNumber = 1} | ||||
|   __init2() {this.lastIndex = 0} | ||||
| 
 | ||||
|   // In development, variable name holding the name of the current file.
 | ||||
|   __init3() {this.filenameVarName = null} | ||||
|   // Mapping of claimed names for imports in the automatic transform, e.g.
 | ||||
|   // {jsx: "_jsx"}. This determines which imports to generate in the prefix.
 | ||||
|   __init4() {this.esmAutomaticImportNameResolutions = {}} | ||||
|   // When automatically adding imports in CJS mode, we store the variable name
 | ||||
|   // holding the imported CJS module so we can require it in the prefix.
 | ||||
|   __init5() {this.cjsAutomaticModuleNameResolutions = {}} | ||||
| 
 | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      importProcessor, | ||||
|      nameManager, | ||||
|      options, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.importProcessor = importProcessor;this.nameManager = nameManager;this.options = options;JSXTransformer.prototype.__init.call(this);JSXTransformer.prototype.__init2.call(this);JSXTransformer.prototype.__init3.call(this);JSXTransformer.prototype.__init4.call(this);JSXTransformer.prototype.__init5.call(this);; | ||||
|     this.jsxPragmaInfo = _getJSXPragmaInfo2.default.call(void 0, options); | ||||
|     this.isAutomaticRuntime = options.jsxRuntime === "automatic"; | ||||
|     this.jsxImportSource = options.jsxImportSource || "react"; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if (this.tokens.matches1(_types.TokenType.jsxTagStart)) { | ||||
|       this.processJSXTag(); | ||||
|       return true; | ||||
|     } | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   getPrefixCode() { | ||||
|     let prefix = ""; | ||||
|     if (this.filenameVarName) { | ||||
|       prefix += `const ${this.filenameVarName} = ${JSON.stringify(this.options.filePath || "")};`; | ||||
|     } | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       if (this.importProcessor) { | ||||
|         // CJS mode: emit require statements for all modules that were referenced.
 | ||||
|         for (const [path, resolvedName] of Object.entries(this.cjsAutomaticModuleNameResolutions)) { | ||||
|           prefix += `var ${resolvedName} = require("${path}");`; | ||||
|         } | ||||
|       } else { | ||||
|         // ESM mode: consolidate and emit import statements for referenced names.
 | ||||
|         const {createElement: createElementResolution, ...otherResolutions} = | ||||
|           this.esmAutomaticImportNameResolutions; | ||||
|         if (createElementResolution) { | ||||
|           prefix += `import {createElement as ${createElementResolution}} from "${this.jsxImportSource}";`; | ||||
|         } | ||||
|         const importSpecifiers = Object.entries(otherResolutions) | ||||
|           .map(([name, resolvedName]) => `${name} as ${resolvedName}`) | ||||
|           .join(", "); | ||||
|         if (importSpecifiers) { | ||||
|           const importPath = | ||||
|             this.jsxImportSource + (this.options.production ? "/jsx-runtime" : "/jsx-dev-runtime"); | ||||
|           prefix += `import {${importSpecifiers}} from "${importPath}";`; | ||||
|         } | ||||
|       } | ||||
|     } | ||||
|     return prefix; | ||||
|   } | ||||
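The ESM branch of `getPrefixCode` above folds every claimed name into one consolidated import statement. A standalone sketch of that consolidation step (the resolution names here are made-up examples, not values from a real run):

```javascript
// Hypothetical resolutions, mimicking esmAutomaticImportNameResolutions above.
const resolutions = {jsx: "_jsx", jsxs: "_jsxs", Fragment: "_Fragment"};

// Build one consolidated import line, as the ESM case of getPrefixCode does.
const importSpecifiers = Object.entries(resolutions)
  .map(([name, resolvedName]) => `${name} as ${resolvedName}`)
  .join(", ");
const prefix = `import {${importSpecifiers}} from "react/jsx-runtime";`;

console.log(prefix);
// import {jsx as _jsx, jsxs as _jsxs, Fragment as _Fragment} from "react/jsx-runtime";
```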
| 
 | ||||
|   processJSXTag() { | ||||
|     const {jsxRole, start} = this.tokens.currentToken(); | ||||
|     // Calculate line number information at the very start (if in development
 | ||||
|     // mode) so that the information is guaranteed to be queried in token order.
 | ||||
|     const elementLocationCode = this.options.production ? null : this.getElementLocationCode(start); | ||||
|     if (this.isAutomaticRuntime && jsxRole !== _tokenizer.JSXRole.KeyAfterPropSpread) { | ||||
|       this.transformTagToJSXFunc(elementLocationCode, jsxRole); | ||||
|     } else { | ||||
|       this.transformTagToCreateElement(elementLocationCode); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   getElementLocationCode(firstTokenStart) { | ||||
|     const lineNumber = this.getLineNumberForIndex(firstTokenStart); | ||||
|     return `lineNumber: ${lineNumber}`; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Get the line number for this source position. This is calculated lazily and | ||||
|    * must be called in increasing order by index. | ||||
|    */ | ||||
|   getLineNumberForIndex(index) { | ||||
|     const code = this.tokens.code; | ||||
|     while (this.lastIndex < index && this.lastIndex < code.length) { | ||||
|       if (code[this.lastIndex] === "\n") { | ||||
|         this.lastLineNumber++; | ||||
|       } | ||||
|       this.lastIndex++; | ||||
|     } | ||||
|     return this.lastLineNumber; | ||||
|   } | ||||
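The lazy, monotonic scan above can be factored out as a closure. This is a standalone re-implementation for illustration (not the class method itself): because the cached index and line number only move forward, repeated queries in increasing index order cost O(n) total over the whole file.

```javascript
// Return a lookup function that maps a source index to a 1-based line number.
// Queries must come in non-decreasing index order, as in the transformer above.
function makeLineNumberLookup(code) {
  let lastIndex = 0;
  let lastLineNumber = 1;
  return (index) => {
    // Advance the cached position, counting newlines along the way.
    while (lastIndex < index && lastIndex < code.length) {
      if (code[lastIndex] === "\n") {
        lastLineNumber++;
      }
      lastIndex++;
    }
    return lastLineNumber;
  };
}

const lookup = makeLineNumberLookup("a\nb\nc");
console.log(lookup(0)); // 1
console.log(lookup(2)); // 2
console.log(lookup(4)); // 3
```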
| 
 | ||||
|   /** | ||||
|    * Convert the current JSX element to a call to jsx, jsxs, or jsxDEV. This is | ||||
|    * the primary transformation for the automatic transform. | ||||
|    * | ||||
|    * Example: | ||||
|    * <div a={1} key={2}>Hello{x}</div> | ||||
|    * becomes | ||||
|    * jsxs('div', {a: 1, children: ["Hello", x]}, 2) | ||||
|    */ | ||||
|   transformTagToJSXFunc(elementLocationCode, jsxRole) { | ||||
|     const isStatic = jsxRole === _tokenizer.JSXRole.StaticChildren; | ||||
|     // First tag is always jsxTagStart.
 | ||||
|     this.tokens.replaceToken(this.getJSXFuncInvocationCode(isStatic)); | ||||
| 
 | ||||
|     let keyCode = null; | ||||
|     if (this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|       // Fragment syntax.
 | ||||
|       this.tokens.replaceToken(`${this.getFragmentCode()}, {`); | ||||
|       this.processAutomaticChildrenAndEndProps(jsxRole); | ||||
|     } else { | ||||
|       // Normal open tag or self-closing tag.
 | ||||
|       this.processTagIntro(); | ||||
|       this.tokens.appendCode(", {"); | ||||
|       keyCode = this.processProps(true); | ||||
| 
 | ||||
|       if (this.tokens.matches2(_types.TokenType.slash, _types.TokenType.jsxTagEnd)) { | ||||
|         // Self-closing tag, no children to add, so close the props.
 | ||||
|         this.tokens.appendCode("}"); | ||||
|       } else if (this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|         // Tag with children.
 | ||||
|         this.tokens.removeToken(); | ||||
|         this.processAutomaticChildrenAndEndProps(jsxRole); | ||||
|       } else { | ||||
|         throw new Error("Expected either /> or > at the end of the tag."); | ||||
|       } | ||||
|       // If a key was present, move it to its own arg. Note that moving code
 | ||||
|       // like this will cause line numbers to get out of sync within the JSX
 | ||||
|       // element if the key expression has a newline in it. This is unfortunate,
 | ||||
|       // but hopefully should be rare.
 | ||||
|       if (keyCode) { | ||||
|         this.tokens.appendCode(`, ${keyCode}`); | ||||
|       } | ||||
|     } | ||||
|     if (!this.options.production) { | ||||
|       // If the key wasn't already added, add it now so we can correctly set
 | ||||
|       // positional args for jsxDEV.
 | ||||
|       if (keyCode === null) { | ||||
|         this.tokens.appendCode(", void 0"); | ||||
|       } | ||||
|       this.tokens.appendCode(`, ${isStatic}, ${this.getDevSource(elementLocationCode)}, this`); | ||||
|     } | ||||
|     // We're at the close-tag or the end of a self-closing tag, so remove
 | ||||
|     // everything else and close the function call.
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     this.tokens.replaceToken(")"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Convert the current JSX element to a createElement call. In the classic | ||||
|    * runtime, this is the only case. In the automatic runtime, this is called | ||||
|    * as a fallback in some situations. | ||||
|    * | ||||
|    * Example: | ||||
|    * <div a={1} key={2}>Hello{x}</div> | ||||
|    * becomes | ||||
|    * React.createElement('div', {a: 1, key: 2}, "Hello", x) | ||||
|    */ | ||||
|   transformTagToCreateElement(elementLocationCode) { | ||||
|     // First tag is always jsxTagStart.
 | ||||
|     this.tokens.replaceToken(this.getCreateElementInvocationCode()); | ||||
| 
 | ||||
|     if (this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|       // Fragment syntax.
 | ||||
|       this.tokens.replaceToken(`${this.getFragmentCode()}, null`); | ||||
|       this.processChildren(true); | ||||
|     } else { | ||||
|       // Normal open tag or self-closing tag.
 | ||||
|       this.processTagIntro(); | ||||
|       this.processPropsObjectWithDevInfo(elementLocationCode); | ||||
| 
 | ||||
|       if (this.tokens.matches2(_types.TokenType.slash, _types.TokenType.jsxTagEnd)) { | ||||
|         // Self-closing tag; no children to process.
 | ||||
|       } else if (this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|         // Tag with children and a close-tag; process the children as args.
 | ||||
|         this.tokens.removeToken(); | ||||
|         this.processChildren(true); | ||||
|       } else { | ||||
|         throw new Error("Expected either /> or > at the end of the tag."); | ||||
|       } | ||||
|     } | ||||
|     // We're at the close-tag or the end of a self-closing tag, so remove
 | ||||
|     // everything else and close the function call.
 | ||||
|     this.tokens.removeInitialToken(); | ||||
|     while (!this.tokens.matches1(_types.TokenType.jsxTagEnd)) { | ||||
|       this.tokens.removeToken(); | ||||
|     } | ||||
|     this.tokens.replaceToken(")"); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Get the code for the relevant function for this context: jsx, jsxs, | ||||
|    * or jsxDEV. The following open-paren is included as well. | ||||
|    * | ||||
|    * These functions are only used for the automatic runtime, so they are always | ||||
|    * auto-imported, but the auto-import will be either CJS or ESM based on the | ||||
|    * target module format. | ||||
|    */ | ||||
|   getJSXFuncInvocationCode(isStatic) { | ||||
|     if (this.options.production) { | ||||
|       if (isStatic) { | ||||
|         return this.claimAutoImportedFuncInvocation("jsxs", "/jsx-runtime"); | ||||
|       } else { | ||||
|         return this.claimAutoImportedFuncInvocation("jsx", "/jsx-runtime"); | ||||
|       } | ||||
|     } else { | ||||
|       return this.claimAutoImportedFuncInvocation("jsxDEV", "/jsx-dev-runtime"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return the code to use for the createElement function, e.g. | ||||
|    * `React.createElement`, including the following open-paren. | ||||
|    * | ||||
|    * This is the main function to use for the classic runtime. For the | ||||
|    * automatic runtime, this function is used as a fallback function to | ||||
|    * preserve behavior when there is a prop spread followed by an explicit | ||||
|    * key. In that automatic runtime case, the function should be automatically | ||||
|    * imported. | ||||
|    */ | ||||
|   getCreateElementInvocationCode() { | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       return this.claimAutoImportedFuncInvocation("createElement", ""); | ||||
|     } else { | ||||
|       const {jsxPragmaInfo} = this; | ||||
|       const resolvedPragmaBaseName = this.importProcessor | ||||
|         ? this.importProcessor.getIdentifierReplacement(jsxPragmaInfo.base) || jsxPragmaInfo.base | ||||
|         : jsxPragmaInfo.base; | ||||
|       return `${resolvedPragmaBaseName}${jsxPragmaInfo.suffix}(`; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return the code to use as the component when compiling a shorthand | ||||
|    * fragment, e.g. `React.Fragment`. | ||||
|    * | ||||
|    * This may be called from either the classic or automatic runtime, and | ||||
|    * the value should be auto-imported for the automatic runtime. | ||||
|    */ | ||||
|   getFragmentCode() { | ||||
|     if (this.isAutomaticRuntime) { | ||||
|       return this.claimAutoImportedName( | ||||
|         "Fragment", | ||||
|         this.options.production ? "/jsx-runtime" : "/jsx-dev-runtime", | ||||
|       ); | ||||
|     } else { | ||||
|       const {jsxPragmaInfo} = this; | ||||
|       const resolvedFragmentPragmaBaseName = this.importProcessor | ||||
|         ? this.importProcessor.getIdentifierReplacement(jsxPragmaInfo.fragmentBase) || | ||||
|           jsxPragmaInfo.fragmentBase | ||||
|         : jsxPragmaInfo.fragmentBase; | ||||
|       return resolvedFragmentPragmaBaseName + jsxPragmaInfo.fragmentSuffix; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Return code that invokes the given function. | ||||
|    * | ||||
|    * When the imports transform is enabled, use the CJSImportTransformer | ||||
|    * strategy of using `.call(void 0, ...` to avoid passing a `this` value in a | ||||
|    * situation that would otherwise look like a method call. | ||||
|    */ | ||||
|   claimAutoImportedFuncInvocation(funcName, importPathSuffix) { | ||||
|     const funcCode = this.claimAutoImportedName(funcName, importPathSuffix); | ||||
|     if (this.importProcessor) { | ||||
|       return `${funcCode}.call(void 0, `; | ||||
|     } else { | ||||
|       return `${funcCode}(`; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   claimAutoImportedName(funcName, importPathSuffix) { | ||||
|     if (this.importProcessor) { | ||||
|       // CJS mode: claim a name for the module and mark it for import.
 | ||||
|       const path = this.jsxImportSource + importPathSuffix; | ||||
|       if (!this.cjsAutomaticModuleNameResolutions[path]) { | ||||
|         this.cjsAutomaticModuleNameResolutions[path] = | ||||
|           this.importProcessor.getFreeIdentifierForPath(path); | ||||
|       } | ||||
|       return `${this.cjsAutomaticModuleNameResolutions[path]}.${funcName}`; | ||||
|     } else { | ||||
|       // ESM mode: claim a name for this function and add it to the names that
 | ||||
|       // should be auto-imported when the prefix is generated.
 | ||||
|       if (!this.esmAutomaticImportNameResolutions[funcName]) { | ||||
|         this.esmAutomaticImportNameResolutions[funcName] = this.nameManager.claimFreeName( | ||||
|           `_${funcName}`, | ||||
|         ); | ||||
|       } | ||||
|       return this.esmAutomaticImportNameResolutions[funcName]; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Process the first part of a tag, before any props. | ||||
|    */ | ||||
|   processTagIntro() { | ||||
|     // Walk forward until we see one of these patterns:
 | ||||
|     // jsxName to start the first prop, preceded by another jsxName to end the tag name.
 | ||||
|     // jsxName to start the first prop, preceded by greaterThan to end the type argument.
 | ||||
|     // [open brace] to start the first prop.
 | ||||
|     // [jsxTagEnd] to end the open-tag.
 | ||||
|     // [slash, jsxTagEnd] to end the self-closing tag.
 | ||||
|     let introEnd = this.tokens.currentIndex() + 1; | ||||
|     while ( | ||||
|       this.tokens.tokens[introEnd].isType || | ||||
|       (!this.tokens.matches2AtIndex(introEnd - 1, _types.TokenType.jsxName, _types.TokenType.jsxName) && | ||||
|         !this.tokens.matches2AtIndex(introEnd - 1, _types.TokenType.greaterThan, _types.TokenType.jsxName) && | ||||
|         !this.tokens.matches1AtIndex(introEnd, _types.TokenType.braceL) && | ||||
|         !this.tokens.matches1AtIndex(introEnd, _types.TokenType.jsxTagEnd) && | ||||
|         !this.tokens.matches2AtIndex(introEnd, _types.TokenType.slash, _types.TokenType.jsxTagEnd)) | ||||
|     ) { | ||||
|       introEnd++; | ||||
|     } | ||||
|     if (introEnd === this.tokens.currentIndex() + 1) { | ||||
|       const tagName = this.tokens.identifierName(); | ||||
|       if (startsWithLowerCase(tagName)) { | ||||
|         this.tokens.replaceToken(`'${tagName}'`); | ||||
|       } | ||||
|     } | ||||
|     while (this.tokens.currentIndex() < introEnd) { | ||||
|       this.rootTransformer.processToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Starting at the beginning of the props, add the props argument to | ||||
|    * React.createElement, including the comma before it. | ||||
|    */ | ||||
|   processPropsObjectWithDevInfo(elementLocationCode) { | ||||
|     const devProps = this.options.production | ||||
|       ? "" | ||||
|       : `__self: this, __source: ${this.getDevSource(elementLocationCode)}`; | ||||
|     if (!this.tokens.matches1(_types.TokenType.jsxName) && !this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|       if (devProps) { | ||||
|         this.tokens.appendCode(`, {${devProps}}`); | ||||
|       } else { | ||||
|         this.tokens.appendCode(`, null`); | ||||
|       } | ||||
|       return; | ||||
|     } | ||||
|     this.tokens.appendCode(`, {`); | ||||
|     this.processProps(false); | ||||
|     if (devProps) { | ||||
|       this.tokens.appendCode(` ${devProps}}`); | ||||
|     } else { | ||||
|       this.tokens.appendCode("}"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform the core part of the props, assuming that a { has already been | ||||
|    * inserted before us and that a } will be inserted after us. | ||||
|    * | ||||
|    * If extractKeyCode is true (i.e. when using any jsx... function), any prop | ||||
|    * named "key" has its code captured and returned rather than being emitted to | ||||
|    * the output code. This shifts line numbers, and emitting the code later will | ||||
|    * correct line numbers again. If no key is found or if extractKeyCode is | ||||
|    * false, this function returns null. | ||||
|    */ | ||||
|   processProps(extractKeyCode) { | ||||
|     let keyCode = null; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches2(_types.TokenType.jsxName, _types.TokenType.eq)) { | ||||
|         // This is a regular key={value} or key="value" prop.
 | ||||
|         const propName = this.tokens.identifierName(); | ||||
|         if (extractKeyCode && propName === "key") { | ||||
|           if (keyCode !== null) { | ||||
|             // The props list has multiple keys. Different implementations are
 | ||||
|             // inconsistent about what to do here: as of this writing, Babel and
 | ||||
|             // swc keep the *last* key and completely remove the rest, while
 | ||||
|             // TypeScript uses the *first* key and leaves the others as regular
 | ||||
|             // props. The React team collaborated with Babel on the
 | ||||
|             // implementation of this behavior, so presumably the Babel behavior
 | ||||
|             // is the one to use.
 | ||||
|             // Since we won't ever be emitting the previous key code, we need to
 | ||||
|             // at least emit its newlines here so that the line numbers match up
 | ||||
|             // in the long run.
 | ||||
|             this.tokens.appendCode(keyCode.replace(/[^\n]/g, "")); | ||||
|           } | ||||
|           // key
 | ||||
|           this.tokens.removeToken(); | ||||
|           // =
 | ||||
|           this.tokens.removeToken(); | ||||
|           const snapshot = this.tokens.snapshot(); | ||||
|           this.processPropValue(); | ||||
|           keyCode = this.tokens.dangerouslyGetAndRemoveCodeSinceSnapshot(snapshot); | ||||
|           // Don't add a comma
 | ||||
|           continue; | ||||
|         } else { | ||||
|           this.processPropName(propName); | ||||
|           this.tokens.replaceToken(": "); | ||||
|           this.processPropValue(); | ||||
|         } | ||||
|       } else if (this.tokens.matches1(_types.TokenType.jsxName)) { | ||||
|         // This is a shorthand prop like <input disabled />.
 | ||||
|         const propName = this.tokens.identifierName(); | ||||
|         this.processPropName(propName); | ||||
|         this.tokens.appendCode(": true"); | ||||
|       } else if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|         // This is prop spread, like <div {...getProps()}>, which we can pass
 | ||||
|         // through fairly directly as an object spread.
 | ||||
|         this.tokens.replaceToken(""); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.replaceToken(""); | ||||
|       } else { | ||||
|         break; | ||||
|       } | ||||
|       this.tokens.appendCode(","); | ||||
|     } | ||||
|     return keyCode; | ||||
|   } | ||||
| 
 | ||||
|   processPropName(propName) { | ||||
|     if (propName.includes("-")) { | ||||
|       this.tokens.replaceToken(`'${propName}'`); | ||||
|     } else { | ||||
|       this.tokens.copyToken(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processPropValue() { | ||||
|     if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|       this.tokens.replaceToken(""); | ||||
|       this.rootTransformer.processBalancedCode(); | ||||
|       this.tokens.replaceToken(""); | ||||
|     } else if (this.tokens.matches1(_types.TokenType.jsxTagStart)) { | ||||
|       this.processJSXTag(); | ||||
|     } else { | ||||
|       this.processStringPropValue(); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   processStringPropValue() { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     const valueCode = this.tokens.code.slice(token.start + 1, token.end - 1); | ||||
|     const replacementCode = formatJSXTextReplacement(valueCode); | ||||
|     const literalCode = formatJSXStringValueLiteral(valueCode); | ||||
|     this.tokens.replaceToken(literalCode + replacementCode); | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Starting in the middle of the props object literal, produce an additional | ||||
|    * prop for the children and close the object literal. | ||||
|    */ | ||||
|   processAutomaticChildrenAndEndProps(jsxRole) { | ||||
|     if (jsxRole === _tokenizer.JSXRole.StaticChildren) { | ||||
|       this.tokens.appendCode(" children: ["); | ||||
|       this.processChildren(false); | ||||
|       this.tokens.appendCode("]}"); | ||||
|     } else { | ||||
|       // The parser information tells us whether we will see a real child or if
 | ||||
|       // all remaining children (if any) will resolve to empty. If there are no
 | ||||
|       // non-empty children, don't emit a children prop at all, but still
 | ||||
|       // process children so that we properly transform the code into nothing.
 | ||||
|       if (jsxRole === _tokenizer.JSXRole.OneChild) { | ||||
|         this.tokens.appendCode(" children: "); | ||||
|       } | ||||
|       this.processChildren(false); | ||||
|       this.tokens.appendCode("}"); | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Transform children into a comma-separated list, which will be either | ||||
|    * arguments to createElement or array elements of a children prop. | ||||
|    */ | ||||
|   processChildren(needsInitialComma) { | ||||
|     let needsComma = needsInitialComma; | ||||
|     while (true) { | ||||
|       if (this.tokens.matches2(_types.TokenType.jsxTagStart, _types.TokenType.slash)) { | ||||
|         // Closing tag, so no more children.
 | ||||
|         return; | ||||
|       } | ||||
|       let didEmitElement = false; | ||||
|       if (this.tokens.matches1(_types.TokenType.braceL)) { | ||||
|         if (this.tokens.matches2(_types.TokenType.braceL, _types.TokenType.braceR)) { | ||||
|           // Empty interpolations and comment-only interpolations are allowed
 | ||||
|           // and don't create an extra child arg.
 | ||||
|           this.tokens.replaceToken(""); | ||||
|           this.tokens.replaceToken(""); | ||||
|         } else { | ||||
|           // Interpolated expression.
 | ||||
|           this.tokens.replaceToken(needsComma ? ", " : ""); | ||||
|           this.rootTransformer.processBalancedCode(); | ||||
|           this.tokens.replaceToken(""); | ||||
|           didEmitElement = true; | ||||
|         } | ||||
|       } else if (this.tokens.matches1(_types.TokenType.jsxTagStart)) { | ||||
|         // Child JSX element
 | ||||
|         this.tokens.appendCode(needsComma ? ", " : ""); | ||||
|         this.processJSXTag(); | ||||
|         didEmitElement = true; | ||||
|       } else if (this.tokens.matches1(_types.TokenType.jsxText) || this.tokens.matches1(_types.TokenType.jsxEmptyText)) { | ||||
|         didEmitElement = this.processChildTextElement(needsComma); | ||||
|       } else { | ||||
|         throw new Error("Unexpected token when processing JSX children."); | ||||
|       } | ||||
|       if (didEmitElement) { | ||||
|         needsComma = true; | ||||
|       } | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Turn a JSX text element into a string literal, or nothing at all if the JSX | ||||
|    * text resolves to the empty string. | ||||
|    * | ||||
|    * Returns true if a string literal is emitted, false otherwise. | ||||
|    */ | ||||
|   processChildTextElement(needsComma) { | ||||
|     const token = this.tokens.currentToken(); | ||||
|     const valueCode = this.tokens.code.slice(token.start, token.end); | ||||
|     const replacementCode = formatJSXTextReplacement(valueCode); | ||||
|     const literalCode = formatJSXTextLiteral(valueCode); | ||||
|     if (literalCode === '""') { | ||||
|       this.tokens.replaceToken(replacementCode); | ||||
|       return false; | ||||
|     } else { | ||||
|       this.tokens.replaceToken(`${needsComma ? ", " : ""}${literalCode}${replacementCode}`); | ||||
|       return true; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   getDevSource(elementLocationCode) { | ||||
|     return `{fileName: ${this.getFilenameVarName()}, ${elementLocationCode}}`; | ||||
|   } | ||||
| 
 | ||||
|   getFilenameVarName() { | ||||
|     if (!this.filenameVarName) { | ||||
|       this.filenameVarName = this.nameManager.claimFreeName("_jsxFileName"); | ||||
|     } | ||||
|     return this.filenameVarName; | ||||
|   } | ||||
| } exports.default = JSXTransformer; | ||||
| 
 | ||||
| /** | ||||
|  * Spec for identifiers: https://tc39.github.io/ecma262/#prod-IdentifierStart.
 | ||||
|  * | ||||
|  * We only treat names starting with a-z as tag names; `_`, `$`, `é` | ||||
|  * should be treated as component names | ||||
|  */ | ||||
|  function startsWithLowerCase(s) { | ||||
|   const firstChar = s.charCodeAt(0); | ||||
|   return firstChar >= _charcodes.charCodes.lowercaseA && firstChar <= _charcodes.charCodes.lowercaseZ; | ||||
| } exports.startsWithLowerCase = startsWithLowerCase; | ||||
| 
 | ||||
| /** | ||||
|  * Turn the given jsxText string into a JS string literal. Leading and trailing | ||||
|  * whitespace on lines is removed, except immediately after the open-tag and | ||||
|  * before the close-tag. Empty lines are completely removed, and spaces are | ||||
|  * added between lines after that. | ||||
|  * | ||||
|  * We use JSON.stringify to introduce escape characters as necessary, and trim | ||||
|  * the start and end of each line and remove blank lines. | ||||
|  */ | ||||
| function formatJSXTextLiteral(text) { | ||||
|   let result = ""; | ||||
|   let whitespace = ""; | ||||
| 
 | ||||
|   let isInInitialLineWhitespace = false; | ||||
|   let seenNonWhitespace = false; | ||||
|   for (let i = 0; i < text.length; i++) { | ||||
|     const c = text[i]; | ||||
|     if (c === " " || c === "\t" || c === "\r") { | ||||
|       if (!isInInitialLineWhitespace) { | ||||
|         whitespace += c; | ||||
|       } | ||||
|     } else if (c === "\n") { | ||||
|       whitespace = ""; | ||||
|       isInInitialLineWhitespace = true; | ||||
|     } else { | ||||
|       if (seenNonWhitespace && isInInitialLineWhitespace) { | ||||
|         result += " "; | ||||
|       } | ||||
|       result += whitespace; | ||||
|       whitespace = ""; | ||||
|       if (c === "&") { | ||||
|         const {entity, newI} = processEntity(text, i + 1); | ||||
|         i = newI - 1; | ||||
|         result += entity; | ||||
|       } else { | ||||
|         result += c; | ||||
|       } | ||||
|       seenNonWhitespace = true; | ||||
|       isInInitialLineWhitespace = false; | ||||
|     } | ||||
|   } | ||||
|   if (!isInInitialLineWhitespace) { | ||||
|     result += whitespace; | ||||
|   } | ||||
|   return JSON.stringify(result); | ||||
| } | ||||
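The whitespace handling in `formatJSXTextLiteral` is easier to see in isolation. This is a simplified standalone copy with the entity handling (`&...;` sequences) omitted for brevity; only the indentation-stripping and line-joining logic remains.

```javascript
// Simplified copy of formatJSXTextLiteral above, without entity decoding.
function collapseJSXText(text) {
  let result = "";
  let whitespace = "";
  let isInInitialLineWhitespace = false;
  let seenNonWhitespace = false;
  for (const c of text) {
    if (c === " " || c === "\t" || c === "\r") {
      // Buffer whitespace; it is emitted only if non-whitespace follows on this line.
      if (!isInInitialLineWhitespace) {
        whitespace += c;
      }
    } else if (c === "\n") {
      // Newline: drop pending trailing whitespace and start skipping indentation.
      whitespace = "";
      isInInitialLineWhitespace = true;
    } else {
      // Join lines with a single space, then flush any buffered whitespace.
      if (seenNonWhitespace && isInInitialLineWhitespace) {
        result += " ";
      }
      result += whitespace;
      whitespace = "";
      result += c;
      seenNonWhitespace = true;
      isInInitialLineWhitespace = false;
    }
  }
  if (!isInInitialLineWhitespace) {
    result += whitespace;
  }
  return JSON.stringify(result);
}

console.log(collapseJSXText("\n  Hello\n  world\n")); // "Hello world"
```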
| 
 | ||||
| /** | ||||
|  * Produce the code that should be printed after the JSX text string literal, | ||||
|  * with most content removed, but all newlines preserved and all spacing at the | ||||
|  * end preserved. | ||||
|  */ | ||||
| function formatJSXTextReplacement(text) { | ||||
|   let numNewlines = 0; | ||||
|   let numSpaces = 0; | ||||
|   for (const c of text) { | ||||
|     if (c === "\n") { | ||||
|       numNewlines++; | ||||
|       numSpaces = 0; | ||||
|     } else if (c === " ") { | ||||
|       numSpaces++; | ||||
|     } | ||||
|   } | ||||
|   return "\n".repeat(numNewlines) + " ".repeat(numSpaces); | ||||
| } | ||||
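The replacement-side helper can be run standalone to see what survives: every character is dropped except the newline count and the spaces seen since the last newline, so the literal plus its replacement preserve the original line layout.

```javascript
// Standalone copy of formatJSXTextReplacement above, for illustration.
function jsxTextReplacement(text) {
  let numNewlines = 0;
  let numSpaces = 0;
  for (const c of text) {
    if (c === "\n") {
      numNewlines++;
      numSpaces = 0; // spaces before a newline never count
    } else if (c === " ") {
      numSpaces++;
    }
  }
  return "\n".repeat(numNewlines) + " ".repeat(numSpaces);
}

console.log(JSON.stringify(jsxTextReplacement("Hi\n  there\n"))); // "\n\n"
```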
| 
 | ||||
| /** | ||||
|  * Format a string in the value position of a JSX prop. | ||||
|  * | ||||
|  * Use the same implementation as convertAttribute from | ||||
|  * babel-helper-builder-react-jsx. | ||||
|  */ | ||||
| function formatJSXStringValueLiteral(text) { | ||||
|   let result = ""; | ||||
|   for (let i = 0; i < text.length; i++) { | ||||
|     const c = text[i]; | ||||
|     if (c === "\n") { | ||||
|       if (/\s/.test(text[i + 1])) { | ||||
|         result += " "; | ||||
|         while (i < text.length && /\s/.test(text[i + 1])) { | ||||
|           i++; | ||||
|         } | ||||
|       } else { | ||||
|         result += "\n"; | ||||
|       } | ||||
|     } else if (c === "&") { | ||||
|       const {entity, newI} = processEntity(text, i + 1); | ||||
|       result += entity; | ||||
|       i = newI - 1; | ||||
|     } else { | ||||
|       result += c; | ||||
|     } | ||||
|   } | ||||
|   return JSON.stringify(result); | ||||
| } | ||||
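As with the text-literal helper, the prop-string formatting is clearer without entities. This simplified standalone copy shows the two newline rules: a newline followed by whitespace collapses (with the run that follows it) into a single space, while a lone newline is kept verbatim.

```javascript
// Simplified copy of formatJSXStringValueLiteral above, without entity decoding.
function formatPropString(text) {
  let result = "";
  for (let i = 0; i < text.length; i++) {
    const c = text[i];
    if (c === "\n") {
      if (/\s/.test(text[i + 1])) {
        // Newline followed by whitespace: emit one space, skip the rest of the run.
        result += " ";
        while (i < text.length && /\s/.test(text[i + 1])) {
          i++;
        }
      } else {
        // A lone newline is preserved as-is.
        result += "\n";
      }
    } else {
      result += c;
    }
  }
  return JSON.stringify(result);
}

console.log(formatPropString("a\n   b")); // "a b"
```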
| 
 | ||||
| /** | ||||
|  * Starting at a &, see if there's an HTML entity (specified by name, decimal | ||||
|  * char code, or hex char code) and return it if so. | ||||
|  * | ||||
|  * Modified from jsxReadString in babel-parser. | ||||
|  */ | ||||
| function processEntity(text, indexAfterAmpersand) { | ||||
|   let str = ""; | ||||
|   let count = 0; | ||||
|   let entity; | ||||
|   let i = indexAfterAmpersand; | ||||
| 
 | ||||
|   if (text[i] === "#") { | ||||
|     let radix = 10; | ||||
|     i++; | ||||
|     let numStart; | ||||
|     if (text[i] === "x") { | ||||
|       radix = 16; | ||||
|       i++; | ||||
|       numStart = i; | ||||
|       while (i < text.length && isHexDigit(text.charCodeAt(i))) { | ||||
|         i++; | ||||
|       } | ||||
|     } else { | ||||
|       numStart = i; | ||||
|       while (i < text.length && isDecimalDigit(text.charCodeAt(i))) { | ||||
|         i++; | ||||
|       } | ||||
|     } | ||||
|     if (text[i] === ";") { | ||||
|       const numStr = text.slice(numStart, i); | ||||
|       if (numStr) { | ||||
|         i++; | ||||
|         entity = String.fromCodePoint(parseInt(numStr, radix)); | ||||
|       } | ||||
|     } | ||||
|   } else { | ||||
|     while (i < text.length && count++ < 10) { | ||||
|       const ch = text[i]; | ||||
|       i++; | ||||
|       if (ch === ";") { | ||||
|         entity = _xhtml2.default.get(str); | ||||
|         break; | ||||
|       } | ||||
|       str += ch; | ||||
|     } | ||||
|   } | ||||
| 
 | ||||
|   if (!entity) { | ||||
|     return {entity: "&", newI: indexAfterAmpersand}; | ||||
|   } | ||||
|   return {entity, newI: i}; | ||||
| } | ||||
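A standalone sketch of `processEntity` above, substituting a tiny named-entity map for the full xhtml table (assumption: only `amp` is registered here). It handles the same three cases: decimal numeric, hex numeric, and named entities, falling back to a literal `&` when nothing matches.

```javascript
// Minimal stand-in for the xhtml entity table used by the real transformer.
const NAMED_ENTITIES = new Map([["amp", "&"]]);

function decodeEntity(text, indexAfterAmpersand) {
  let str = "";
  let count = 0;
  let entity;
  let i = indexAfterAmpersand;

  if (text[i] === "#") {
    // Numeric entity: decimal by default, hex when prefixed with "x".
    let radix = 10;
    i++;
    let numStart;
    if (text[i] === "x") {
      radix = 16;
      i++;
      numStart = i;
      while (i < text.length && /[0-9a-fA-F]/.test(text[i])) i++;
    } else {
      numStart = i;
      while (i < text.length && /[0-9]/.test(text[i])) i++;
    }
    if (text[i] === ";") {
      const numStr = text.slice(numStart, i);
      if (numStr) {
        i++;
        entity = String.fromCodePoint(parseInt(numStr, radix));
      }
    }
  } else {
    // Named entity: read up to 10 characters looking for the closing ";".
    while (i < text.length && count++ < 10) {
      const ch = text[i];
      i++;
      if (ch === ";") {
        entity = NAMED_ENTITIES.get(str);
        break;
      }
      str += ch;
    }
  }

  if (!entity) {
    // Not a recognized entity: emit "&" literally and resume right after it.
    return {entity: "&", newI: indexAfterAmpersand};
  }
  return {entity, newI: i};
}

console.log(decodeEntity("&#x41;", 1)); // { entity: 'A', newI: 6 }
console.log(decodeEntity("&amp;", 1));  // { entity: '&', newI: 5 }
```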
| 
 | ||||
| function isDecimalDigit(code) { | ||||
|   return code >= _charcodes.charCodes.digit0 && code <= _charcodes.charCodes.digit9; | ||||
| } | ||||
| 
 | ||||
| function isHexDigit(code) { | ||||
|   return ( | ||||
|     (code >= _charcodes.charCodes.digit0 && code <= _charcodes.charCodes.digit9) || | ||||
|     (code >= _charcodes.charCodes.lowercaseA && code <= _charcodes.charCodes.lowercaseF) || | ||||
|     (code >= _charcodes.charCodes.uppercaseA && code <= _charcodes.charCodes.uppercaseF) | ||||
|   ); | ||||
| } | ||||
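For readers skimming this vendored diff: the entity-decoding approach above (starting just past a `&`, accept `&#NN;` decimal, `&#xNN;` hex, or a short named entity, else fall back to a literal ampersand) can be sketched standalone. This is a hypothetical minimal version with a tiny hand-rolled entity map, not sucrase's actual XHTML entity table:

```javascript
// Hypothetical standalone sketch of processEntity above; the real code looks
// names up in sucrase's full XHTML entity table, this map is only a stub.
const NAMED_ENTITIES = new Map([["amp", "&"], ["lt", "<"], ["gt", ">"], ["quot", "\""]]);

function processEntity(text, indexAfterAmpersand) {
  let i = indexAfterAmpersand;
  if (text[i] === "#") {
    // Numeric entity: &#65; (decimal) or &#x41; (hex).
    let radix = 10;
    i++;
    if (text[i] === "x") { radix = 16; i++; }
    const numStart = i;
    const digit = radix === 16 ? /[0-9a-fA-F]/ : /[0-9]/;
    while (i < text.length && digit.test(text[i])) i++;
    if (text[i] === ";" && i > numStart) {
      return {entity: String.fromCodePoint(parseInt(text.slice(numStart, i), radix)), newI: i + 1};
    }
  } else {
    // Named entity: scan at most 10 characters for a terminating ";".
    let str = "";
    let count = 0;
    while (i < text.length && count++ < 10) {
      const ch = text[i];
      i++;
      if (ch === ";") {
        const entity = NAMED_ENTITIES.get(str);
        if (entity) return {entity, newI: i};
        break;
      }
      str += ch;
    }
  }
  // No valid entity: emit a literal "&" and resume right after it.
  return {entity: "&", newI: indexAfterAmpersand};
}
```

As in the original, the fallback returns `newI` unchanged so the caller copies the `&` through verbatim and keeps scanning from the next character.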
							
								
								
									
										111
								node_modules/sucrase/dist/transformers/JestHoistTransformer.js
										generated
										vendored
										Normal file
							|  | @ -0,0 +1,111 @@ | |||
| "use strict";Object.defineProperty(exports, "__esModule", {value: true}); function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { default: obj }; } function _optionalChain(ops) { let lastAccessLHS = undefined; let value = ops[0]; let i = 1; while (i < ops.length) { const op = ops[i]; const fn = ops[i + 1]; i += 2; if ((op === 'optionalAccess' || op === 'optionalCall') && value == null) { return undefined; } if (op === 'access' || op === 'optionalAccess') { lastAccessLHS = value; value = fn(value); } else if (op === 'call' || op === 'optionalCall') { value = fn((...args) => value.call(lastAccessLHS, ...args)); lastAccessLHS = undefined; } } return value; } | ||||
| 
 | ||||
| var _types = require('../parser/tokenizer/types'); | ||||
| 
 | ||||
| 
 | ||||
| var _Transformer = require('./Transformer'); var _Transformer2 = _interopRequireDefault(_Transformer); | ||||
| 
 | ||||
| const JEST_GLOBAL_NAME = "jest"; | ||||
| const HOISTED_METHODS = ["mock", "unmock", "enableAutomock", "disableAutomock"]; | ||||
| 
 | ||||
| /** | ||||
|  * Implementation of babel-plugin-jest-hoist, which hoists up some jest method | ||||
|  * calls above the imports to allow them to override other imports. | ||||
|  * | ||||
|  * To preserve line numbers, rather than directly moving the jest.mock code, we | ||||
|  * wrap each invocation in a function statement and then call the function from | ||||
|  * the top of the file. | ||||
|  */ | ||||
|  class JestHoistTransformer extends _Transformer2.default { | ||||
|     __init() {this.hoistedFunctionNames = []} | ||||
| 
 | ||||
|   constructor( | ||||
|      rootTransformer, | ||||
|      tokens, | ||||
|      nameManager, | ||||
|      importProcessor, | ||||
|   ) { | ||||
|     super();this.rootTransformer = rootTransformer;this.tokens = tokens;this.nameManager = nameManager;this.importProcessor = importProcessor;JestHoistTransformer.prototype.__init.call(this);; | ||||
|   } | ||||
| 
 | ||||
|   process() { | ||||
|     if ( | ||||
|       this.tokens.currentToken().scopeDepth === 0 && | ||||
|       this.tokens.matches4(_types.TokenType.name, _types.TokenType.dot, _types.TokenType.name, _types.TokenType.parenL) && | ||||
|       this.tokens.identifierName() === JEST_GLOBAL_NAME | ||||
|     ) { | ||||
|       // TODO: This only works if imports transform is active, which it will be for jest. | ||||
|       //       But if jest adds module support and we no longer need the import transform, this needs fixing. | ||||
|       if (_optionalChain([this, 'access', _ => _.importProcessor, 'optionalAccess', _2 => _2.getGlobalNames, 'call', _3 => _3(), 'optionalAccess', _4 => _4.has, 'call', _5 => _5(JEST_GLOBAL_NAME)])) { | ||||
|         return false; | ||||
|       } | ||||
|       return this.extractHoistedCalls(); | ||||
|     } | ||||
| 
 | ||||
|     return false; | ||||
|   } | ||||
| 
 | ||||
|   getHoistedCode() { | ||||
|     if (this.hoistedFunctionNames.length > 0) { | ||||
|       // This will be placed before module interop code, but that's fine since | ||||
|       // imports aren't allowed in module mock factories. | ||||
|       return this.hoistedFunctionNames.map((name) => `${name}();`).join(""); | ||||
|     } | ||||
|     return ""; | ||||
|   } | ||||
| 
 | ||||
|   /** | ||||
|    * Extracts any methods calls on the jest-object that should be hoisted. | ||||
|    * | ||||
|    * According to the jest docs, https://jestjs.io/docs/en/jest-object#jestmockmodulename-factory-options, | ||||
|    * mock, unmock, enableAutomock, and disableAutomock are the methods that should be hoisted. | ||||
|    * | ||||
|    * We do not apply the same checks of the arguments as babel-plugin-jest-hoist does. | ||||
|    */ | ||||
|    extractHoistedCalls() { | ||||
|     // We're handling a chain of calls where `jest` may or may not need to be inserted for each call | ||||
|     // in the chain, so remove the initial `jest` to make the loop implementation cleaner. | ||||
|     this.tokens.removeToken(); | ||||
|     // Track some state so that multiple non-hoisted chained calls in a row keep their chaining | ||||
|     // syntax. | ||||
|     let followsNonHoistedJestCall = false; | ||||
| 
 | ||||
|     // Iterate through all chained calls on the jest object. | ||||
|     while (this.tokens.matches3(_types.TokenType.dot, _types.TokenType.name, _types.TokenType.parenL)) { | ||||
|       const methodName = this.tokens.identifierNameAtIndex(this.tokens.currentIndex() + 1); | ||||
|       const shouldHoist = HOISTED_METHODS.includes(methodName); | ||||
|       if (shouldHoist) { | ||||
|         // We've matched e.g. `.mock(...)` or similar call. | ||||
|         // Replace the initial `.` with `function __jestHoist(){jest.` | ||||
|         const hoistedFunctionName = this.nameManager.claimFreeName("__jestHoist"); | ||||
|         this.hoistedFunctionNames.push(hoistedFunctionName); | ||||
|         this.tokens.replaceToken(`function ${hoistedFunctionName}(){${JEST_GLOBAL_NAME}.`); | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|         this.tokens.appendCode(";}"); | ||||
|         followsNonHoistedJestCall = false; | ||||
|       } else { | ||||
|         // This is a non-hoisted method, so just transform the code as usual. | ||||
|         if (followsNonHoistedJestCall) { | ||||
|           // If we didn't hoist the previous call, we can leave the code as-is to chain off of the | ||||
|           // previous method call. It's important to preserve the code here because we don't know | ||||
|           // for sure that the method actually returned the jest object for chaining. | ||||
|           this.tokens.copyToken(); | ||||
|         } else { | ||||
|           // If we hoisted the previous call, we know it returns the jest object back, so we insert | ||||
|           // the identifier `jest` to continue the chain. | ||||
|           this.tokens.replaceToken(`${JEST_GLOBAL_NAME}.`); | ||||
|         } | ||||
|         this.tokens.copyToken(); | ||||
|         this.tokens.copyToken(); | ||||
|         this.rootTransformer.processBalancedCode(); | ||||
|         this.tokens.copyExpectedToken(_types.TokenType.parenR); | ||||
|         followsNonHoistedJestCall = true; | ||||
|       } | ||||
|     } | ||||
| 
 | ||||
|     return true; | ||||
|   } | ||||
| } exports.default = JestHoistTransformer; | ||||
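The wrap-in-place strategy described in the class comment above (preserve line numbers by wrapping each hoisted `jest.mock`-style call in a named function where it stands, then calling those functions from the top of the file) can be illustrated with a toy string-based sketch. This regex version is hypothetical and far less robust than the token-based transformer in the diff; it only exists to show the shape of the output:

```javascript
// Toy illustration (hypothetical, line-based) of the hoisting strategy:
// each top-level hoistable jest call is rewritten in place into a function
// definition, and calls to those functions are prepended to the file.
function hoistJestMocks(code) {
  const hoistedNames = [];
  let counter = 0;
  const wrapped = code
    .split("\n")
    .map((line) => {
      // Only whole-line, top-level calls to the four hoistable methods.
      const match = /^jest\.(mock|unmock|enableAutomock|disableAutomock)\((.*)\);$/.exec(line);
      if (!match) return line;
      const name = `__jestHoist${counter++}`;
      hoistedNames.push(name);
      // Wrap in place so the line count (and thus line numbers) is unchanged.
      return `function ${name}(){jest.${match[1]}(${match[2]});}`;
    })
    .join("\n");
  // Invoke every hoisted function before any of the original code runs.
  return hoistedNames.map((n) => `${n}();`).join("") + "\n" + wrapped;
}
```

The real transformer does this on the token stream (handling chained calls and nested parentheses via `processBalancedCode`), but the before/after shape is the same: the mock body stays on its original line, and only a short call site moves to the top.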
Some files were not shown because too many files have changed in this diff