Template Upload

This commit is contained in:
SOUTHERNCO\x2mjbyrn
2017-05-17 13:45:25 -04:00
parent 415b9c25f3
commit 7efe7605b8
11476 changed files with 2170865 additions and 34 deletions

15
node_modules/readdirp/.npmignore generated vendored Normal file

@@ -0,0 +1,15 @@
lib-cov
*.seed
*.log
*.csv
*.dat
*.out
*.pid
*.gz
pids
logs
results
node_modules
npm-debug.log

6
node_modules/readdirp/.travis.yml generated vendored Normal file

@@ -0,0 +1,6 @@
language: node_js
node_js:
- "0.10"
- "0.12"
- "4.4"
- "6.2"

20
node_modules/readdirp/LICENSE generated vendored Normal file

@@ -0,0 +1,20 @@
This software is released under the MIT license:
Copyright (c) 2012-2015 Thorsten Lorenz
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of
the Software, and to permit persons to whom the Software is furnished to do so,
subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR
COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER
IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN
CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

233
node_modules/readdirp/README.md generated vendored Normal file

@@ -0,0 +1,233 @@
# readdirp [![Build Status](https://secure.travis-ci.org/thlorenz/readdirp.png)](http://travis-ci.org/thlorenz/readdirp)
[![NPM](https://nodei.co/npm/readdirp.png?downloads=true&stars=true)](https://nodei.co/npm/readdirp/)
Recursive version of [fs.readdir](http://nodejs.org/docs/latest/api/fs.html#fs_fs_readdir_path_callback). Exposes a **stream api**.
```javascript
var readdirp = require('readdirp')
, path = require('path')
, es = require('event-stream');
// print out all JavaScript files along with their size
var stream = readdirp({ root: path.join(__dirname), fileFilter: '*.js' });
stream
.on('warn', function (err) {
console.error('non-fatal error', err);
// optionally call stream.destroy() here in order to abort and cause 'close' to be emitted
})
.on('error', function (err) { console.error('fatal error', err); })
.pipe(es.mapSync(function (entry) {
return { path: entry.path, size: entry.stat.size };
}))
.pipe(es.stringify())
.pipe(process.stdout);
```
Meant to be a recursive counterpart to the [fs](http://nodejs.org/docs/latest/api/fs.html) functions, in the same spirit as [mkdirp](https://github.com/substack/node-mkdirp).
**Table of Contents** *generated with [DocToc](http://doctoc.herokuapp.com/)*
- [Installation](#installation)
- [API](#api)
- [entry stream](#entry-stream)
- [options](#options)
- [entry info](#entry-info)
- [Filters](#filters)
- [Callback API](#callback-api)
- [allProcessed](#allprocessed)
- [fileProcessed](#fileprocessed)
- [More Examples](#more-examples)
- [stream api](#stream-api)
- [stream api pipe](#stream-api-pipe)
- [grep](#grep)
- [using callback api](#using-callback-api)
- [tests](#tests)
# Installation
    npm install readdirp
# API
***var entryStream = readdirp (options)***
Reads given root recursively and returns a `stream` of [entry info](#entry-info)s.
## entry stream
Behaves as follows:
- `emit('data')` passes an [entry info](#entry-info) whenever one is found
- `emit('warn')` passes a non-fatal `Error` that prevents a file/directory from being processed (e.g., if it is
inaccessible to the user)
- `emit('error')` passes a fatal `Error` which also ends the stream (e.g., when illegal options were passed)
- `emit('end')` called when all entries were found and no more will be emitted (i.e., we are done)
- `emit('close')` called when the stream is destroyed via `stream.destroy()` (which could be useful if you want to
manually abort even on a non-fatal error) - at that point the stream is no longer `readable` and no more entries,
warnings or errors are emitted (see the sketch after this list)
- to learn more about streams, consult the very detailed
[nodejs streams documentation](http://nodejs.org/api/stream.html) or the
[stream-handbook](https://github.com/substack/stream-handbook)
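For example, the `'close'` behaviour above can be used to abort a walk early. A minimal sketch (the root path is a placeholder):
```javascript
var readdirp = require('readdirp');

var stream = readdirp({ root: 'some/dir' }); // placeholder root
stream
  .on('data', function (entry) { console.log(entry.path); })
  .on('warn', function (err) {
    console.error('giving up after a non-fatal error', err);
    stream.destroy(); // aborts the walk; 'close' is emitted and nothing else follows
  })
  .on('close', function () { console.log('stream destroyed, no further entries'); })
  .on('end', function () { console.log('walk completed'); });
```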
## options
- **root**: path in which to start reading and recursing into subdirectories
- **fileFilter**: filter to include/exclude files found (see [Filters](#filters) for more)
- **directoryFilter**: filter to include/exclude directories found and to recurse into (see [Filters](#filters) for more)
- **depth**: depth at which to stop recursing even if more subdirectories are found
- **entryType**: determines if data events on the stream should be emitted for `'files'`, `'directories'`, `'both'`, or `'all'`. Setting to `'all'` will also include entries for other types of file descriptors like character devices, unix sockets and named pipes. Defaults to `'files'`.
- **lstat**: if `true`, readdirp uses `fs.lstat` instead of `fs.stat` in order to stat files and includes symlink entries in the stream along with files. (All of these options are combined in the sketch below.)
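All of these options can be combined in a single call. A minimal sketch (the root path and filters are placeholders):
```javascript
var readdirp = require('readdirp')
  , path = require('path');

readdirp({
    root: path.join(__dirname, 'some/dir')          // placeholder start directory
  , fileFilter: '*.json'                            // only emit JSON files
  , directoryFilter: [ '!.git', '!node_modules' ]   // never recurse into these
  , depth: 2                                        // stop recursing two levels down
  , entryType: 'files'                              // the default; 'directories', 'both' or 'all' also work
  , lstat: true                                     // stat with fs.lstat so symlink entries are included
})
  .on('data', function (entry) { console.log(entry.path); });
```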
## entry info
Has the following properties:
- **parentDir** : directory in which entry was found (relative to given root)
- **fullParentDir** : full path to parent directory
- **name** : name of the file/directory
- **path** : path to the file/directory (relative to given root)
- **fullPath** : full path to the file/directory found
- **stat** : built in [stat object](http://nodejs.org/docs/v0.4.9/api/fs.html#fs.Stats)
- **Example**: (assuming root was `/User/dev/readdirp`)

      parentDir     : 'test/bed/root_dir1',
      fullParentDir : '/User/dev/readdirp/test/bed/root_dir1',
      name          : 'root_dir1_subdir1',
      path          : 'test/bed/root_dir1/root_dir1_subdir1',
      fullPath      : '/User/dev/readdirp/test/bed/root_dir1/root_dir1_subdir1',
      stat          : [ ... ]
## Filters
There are three different ways to specify filters for files and directories respectively.
- **function**: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry (see the sketch at the end of this section)
- **glob string**: a string (e.g., `*.js`) which is matched using [minimatch](https://github.com/isaacs/minimatch), so go there for more
information.
Globstars (`**`) are not supported since specifying a recursive pattern for an already recursive function doesn't make sense.
Negated globs (as explained in the minimatch documentation) are allowed, e.g., `!*.txt` matches everything but text files.
- **array of glob strings**: either need to be all inclusive or all exclusive (negated) patterns otherwise an error is thrown.
`[ '*.json', '*.js' ]` includes all JavaScript and JSON files.
`[ '!.git', '!node_modules' ]` includes all directories except '.git' and 'node_modules'.
Directories that do not pass a filter will not be recursed into.
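Since mixing negated and non-negated globs throws, a function filter is the simplest way to express conditions that globs cannot. A minimal sketch (the size threshold is arbitrary):
```javascript
var readdirp = require('readdirp');

// keep only files larger than 1 kB, whatever their extension
readdirp({
    root: './test/bed'
  , fileFilter: function (entry) { return entry.stat.size > 1024; }
})
  .on('data', function (entry) {
    console.log('%s (%d bytes)', entry.path, entry.stat.size);
  });
```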
## Callback API
Although the stream api is recommended, readdirp also exposes a callback-based api.
***readdirp (options, callback1 [, callback2])***
If callback2 is given, callback1 functions as the **fileProcessed** callback, and callback2 as the **allProcessed** callback.
If only callback1 is given, it functions as the **allProcessed** callback.
### allProcessed
- function with err and res parameters, e.g., `function (err, res) { ... }`
- **err**: array of errors that occurred during the operation, **res may still be present, even if errors occurred**
- **res**: collection of file/directory [entry infos](#entry-info)
### fileProcessed
- function with [entry info](#entry-info) parameter e.g., `function (entryInfo) { ... }`
# More Examples
`on('error', ..)`, `on('warn', ..)` and `on('end', ..)` handling omitted for brevity
```javascript
var readdirp = require('readdirp');
// Glob file filter
readdirp({ root: './test/bed', fileFilter: '*.js' })
.on('data', function (entry) {
// do something with each JavaScript file entry
});
// Combined glob file filters
readdirp({ root: './test/bed', fileFilter: [ '*.js', '*.json' ] })
.on('data', function (entry) {
// do something with each JavaScript and JSON file entry
});
// Combined negated directory filters
readdirp({ root: './test/bed', directoryFilter: [ '!.git', '!*modules' ] })
.on('data', function (entry) {
// do something with each file entry found outside '.git' or any modules directory
});
// Function directory filter
readdirp({ root: './test/bed', directoryFilter: function (di) { return di.name.length === 9; } })
.on('data', function (entry) {
// do something with each file entry found inside directories whose name has length 9
});
// Limiting depth
readdirp({ root: './test/bed', depth: 1 })
.on('data', function (entry) {
// do something with each file entry found up to 1 subdirectory deep
});
// callback api
readdirp(
{ root: '.' }
, function(fileInfo) {
// do something with file entry here
}
, function (err, res) {
// all done, move on or do final step for all file entries here
}
);
```
Try more examples by following [instructions](https://github.com/thlorenz/readdirp/blob/master/examples/Readme.md)
on how to get going.
## stream api
[stream-api.js](https://github.com/thlorenz/readdirp/blob/master/examples/stream-api.js)
Demonstrates error and data handling by listening to events emitted from the readdirp stream.
## stream api pipe
[stream-api-pipe.js](https://github.com/thlorenz/readdirp/blob/master/examples/stream-api-pipe.js)
Demonstrates error handling by listening to events emitted from the readdirp stream and how to pipe the data stream into
another destination stream.
## grep
[grep.js](https://github.com/thlorenz/readdirp/blob/master/examples/grep.js)
Very naive implementation of grep, for demonstration purposes only.
## using callback api
[callback-api.js](https://github.com/thlorenz/readdirp/blob/master/examples/callback-api.js)
Shows how to pass callbacks in order to handle errors and/or data.
## tests
The [readdirp tests](https://github.com/thlorenz/readdirp/blob/master/test/readdirp.js) will also give you a good idea of
how things work.

37
node_modules/readdirp/examples/Readme.md generated vendored Normal file

@@ -0,0 +1,37 @@
# readdirp examples
## How to run the examples
Assuming you installed readdirp (`npm install readdirp`), you can do the following:
1. `npm explore readdirp`
2. `cd examples`
3. `npm install`
At that point you can run the examples with node, e.g., `node grep`.
## stream api
[stream-api.js](https://github.com/thlorenz/readdirp/blob/master/examples/stream-api.js)
Demonstrates error and data handling by listening to events emitted from the readdirp stream.
## stream api pipe
[stream-api-pipe.js](https://github.com/thlorenz/readdirp/blob/master/examples/stream-api-pipe.js)
Demonstrates error handling by listening to events emitted from the readdirp stream and how to pipe the data stream into
another destination stream.
## grep
[grep.js](https://github.com/thlorenz/readdirp/blob/master/examples/grep.js)
Very naive implementation of grep, for demonstration purposes only.
## using callback api
[callback-api.js](https://github.com/thlorenz/readdirp/blob/master/examples/callback-api.js)
Shows how to pass callbacks in order to handle errors and/or data.

10
node_modules/readdirp/examples/callback-api.js generated vendored Normal file

@@ -0,0 +1,10 @@
var readdirp = require('..');
readdirp({ root: '.', fileFilter: '*.js' }, function (errors, res) {
if (errors) {
errors.forEach(function (err) {
console.error('Error: ', err);
});
}
console.log('all javascript files', res);
});

71
node_modules/readdirp/examples/grep.js generated vendored Normal file

@@ -0,0 +1,71 @@
'use strict';
var readdirp = require('..')
, util = require('util')
, fs = require('fs')
, path = require('path')
, es = require('event-stream')
;
function findLinesMatching (searchTerm) {
return es.through(function (entry) {
var lineno = 0
, matchingLines = []
, fileStream = this;
function filter () {
return es.mapSync(function (line) {
lineno++;
return ~line.indexOf(searchTerm) ? lineno + ': ' + line : undefined;
});
}
function aggregate () {
return es.through(
function write (data) {
matchingLines.push(data);
}
, function end () {
// drop files that had no matches
if (matchingLines.length) {
var result = { file: entry, lines: matchingLines };
// pass result on to file stream
fileStream.emit('data', result);
}
this.emit('end');
}
);
}
fs.createReadStream(entry.fullPath, { encoding: 'utf-8' })
// handle file contents line by line
.pipe(es.split('\n'))
// keep only the lines that matched the term
.pipe(filter())
// aggregate all matching lines and delegate control back to the file stream
.pipe(aggregate())
;
});
}
console.log('grepping for "arguments"');
// create a stream of all javascript files found in this and all sub directories
readdirp({ root: path.join(__dirname), fileFilter: '*.js' })
// find all lines matching the term for each file (if none found, that file is ignored)
.pipe(findLinesMatching('arguments'))
// format the results and output
.pipe(
es.mapSync(function (res) {
return '\n\n' + res.file.path + '\n\t' + res.lines.join('\n\t');
})
)
.pipe(process.stdout)
;

9
node_modules/readdirp/examples/package.json generated vendored Normal file

@@ -0,0 +1,9 @@
{
"name": "readdirp-examples",
"version": "0.0.0",
"description": "Examples for readdirp.",
"dependencies": {
"tap-stream": "~0.1.0",
"event-stream": "~3.0.7"
}
}

19
node_modules/readdirp/examples/stream-api-pipe.js generated vendored Normal file

@@ -0,0 +1,19 @@
var readdirp = require('..')
, path = require('path')
, through = require('through2')
// print out all JavaScript files along with their size
readdirp({ root: path.join(__dirname), fileFilter: '*.js' })
.on('warn', function (err) { console.error('non-fatal error', err); })
.on('error', function (err) { console.error('fatal error', err); })
.pipe(through.obj(function (entry, _, cb) {
this.push({ path: entry.path, size: entry.stat.size });
cb();
}))
.pipe(through.obj(
function (res, _, cb) {
this.push(JSON.stringify(res) + '\n');
cb();
})
)
.pipe(process.stdout);

15
node_modules/readdirp/examples/stream-api.js generated vendored Normal file

@@ -0,0 +1,15 @@
var readdirp = require('..')
, path = require('path');
readdirp({ root: path.join(__dirname), fileFilter: '*.js' })
.on('warn', function (err) {
console.error('something went wrong when processing an entry', err);
})
.on('error', function (err) {
console.error('something went fatally wrong and the stream was aborted', err);
})
.on('data', function (entry) {
console.log('%s is ready for processing', entry.path);
// process entry here
});

105
node_modules/readdirp/package.json generated vendored Normal file

@@ -0,0 +1,105 @@
{
"_args": [
[
"readdirp@^2.0.0",
"C:\\Users\\x2mjbyrn\\Source\\Repos\\Skeleton\\node_modules\\chokidar"
]
],
"_from": "readdirp@>=2.0.0-0 <3.0.0-0",
"_id": "readdirp@2.1.0",
"_inCache": true,
"_location": "/readdirp",
"_nodeVersion": "4.4.6",
"_npmOperationalInternal": {
"host": "packages-16-east.internal.npmjs.com",
"tmp": "tmp/readdirp-2.1.0.tgz_1467053820730_0.8782131769694388"
},
"_npmUser": {
"email": "thlorenz@gmx.de",
"name": "thlorenz"
},
"_npmVersion": "2.15.6",
"_phantomChildren": {},
"_requested": {
"name": "readdirp",
"raw": "readdirp@^2.0.0",
"rawSpec": "^2.0.0",
"scope": null,
"spec": ">=2.0.0-0 <3.0.0-0",
"type": "range"
},
"_requiredBy": [
"/chokidar"
],
"_resolved": "https://registry.npmjs.org/readdirp/-/readdirp-2.1.0.tgz",
"_shasum": "4ed0ad060df3073300c48440373f72d1cc642d78",
"_shrinkwrap": null,
"_spec": "readdirp@^2.0.0",
"_where": "C:\\Users\\x2mjbyrn\\Source\\Repos\\Skeleton\\node_modules\\chokidar",
"author": {
"email": "thlorenz@gmx.de",
"name": "Thorsten Lorenz",
"url": "thlorenz.com"
},
"bugs": {
"url": "https://github.com/thlorenz/readdirp/issues"
},
"dependencies": {
"graceful-fs": "^4.1.2",
"minimatch": "^3.0.2",
"readable-stream": "^2.0.2",
"set-immediate-shim": "^1.0.1"
},
"description": "Recursive version of fs.readdir with streaming api.",
"devDependencies": {
"nave": "^0.5.1",
"proxyquire": "^1.7.9",
"tap": "1.3.2",
"through2": "^2.0.0"
},
"directories": {},
"dist": {
"shasum": "4ed0ad060df3073300c48440373f72d1cc642d78",
"tarball": "https://registry.npmjs.org/readdirp/-/readdirp-2.1.0.tgz"
},
"engines": {
"node": ">=0.6"
},
"gitHead": "5a3751f86a1c2bbbb8e3a42685d4191992631e6c",
"homepage": "https://github.com/thlorenz/readdirp",
"installable": true,
"keywords": [
"filesystem",
"filter",
"find",
"fs",
"readdir",
"recursive",
"stream",
"streams"
],
"license": "MIT",
"main": "readdirp.js",
"maintainers": [
{
"name": "thlorenz",
"email": "thlorenz@gmx.de"
}
],
"name": "readdirp",
"optionalDependencies": {},
"repository": {
"type": "git",
"url": "git://github.com/thlorenz/readdirp.git"
},
"scripts": {
"test": "if [ -e $TRAVIS ]; then npm run test-all; else npm run test-main; fi",
"test-0.10": "nave use 0.10 npm run test-main",
"test-0.12": "nave use 0.12 npm run test-main",
"test-4": "nave use 4.4 npm run test-main",
"test-6": "nave use 6.2 npm run test-main",
"test-all": "npm run test-main && npm run test-0.10 && npm run test-0.12 && npm run test-4 && npm run test-6",
"test-main": "(cd test && set -e; for t in ./*.js; do node $t; done)"
},
"version": "2.1.0"
}

300
node_modules/readdirp/readdirp.js generated vendored Normal file

@@ -0,0 +1,300 @@
'use strict';
var fs = require('graceful-fs')
, path = require('path')
, minimatch = require('minimatch')
, toString = Object.prototype.toString
, si = require('set-immediate-shim')
;
// Standard helpers
function isFunction (obj) {
return toString.call(obj) === '[object Function]';
}
function isString (obj) {
return toString.call(obj) === '[object String]';
}
function isRegExp (obj) {
return toString.call(obj) === '[object RegExp]';
}
function isUndefined (obj) {
return obj === void 0;
}
/**
* Main function which ends up calling readdirRec and reads all files and directories in given root recursively.
* @param { Object } opts Options to specify root (start directory), filters and recursion depth
* @param { function } callback1 When callback2 is given calls back for each processed file - function (fileInfo) { ... },
* when callback2 is not given, it behaves like explained in callback2
* @param { function } callback2 Calls back once all files have been processed with an array of errors and file infos
* function (err, fileInfos) { ... }
*/
function readdir(opts, callback1, callback2) {
var stream
, handleError
, handleFatalError
, pending = 0
, errors = []
, readdirResult = {
directories: []
, files: []
}
, fileProcessed
, allProcessed
, realRoot
, aborted = false
, paused = false
;
// If no callbacks were given we will use a streaming interface
if (isUndefined(callback1)) {
var api = require('./stream-api')();
stream = api.stream;
callback1 = api.processEntry;
callback2 = api.done;
handleError = api.handleError;
handleFatalError = api.handleFatalError;
stream.on('close', function () { aborted = true; });
stream.on('pause', function () { paused = true; });
stream.on('resume', function () { paused = false; });
} else {
handleError = function (err) { errors.push(err); };
handleFatalError = function (err) {
handleError(err);
allProcessed(errors, null);
};
}
if (isUndefined(opts)){
handleFatalError(new Error (
'Need to pass at least one argument: opts! \n' +
'https://github.com/thlorenz/readdirp#options'
)
);
return stream;
}
opts.root = opts.root || '.';
opts.fileFilter = opts.fileFilter || function() { return true; };
opts.directoryFilter = opts.directoryFilter || function() { return true; };
opts.depth = typeof opts.depth === 'undefined' ? 999999999 : opts.depth;
opts.entryType = opts.entryType || 'files';
var statfn = opts.lstat === true ? fs.lstat.bind(fs) : fs.stat.bind(fs);
if (isUndefined(callback2)) {
fileProcessed = function() { };
allProcessed = callback1;
} else {
fileProcessed = callback1;
allProcessed = callback2;
}
function normalizeFilter (filter) {
if (isUndefined(filter)) return undefined;
function isNegated (filters) {
function negated(f) {
return f.indexOf('!') === 0;
}
var some = filters.some(negated);
if (!some) {
return false;
} else {
if (filters.every(negated)) {
return true;
} else {
// if we detect illegal filters, bail out immediately
throw new Error(
'Cannot mix negated with non negated glob filters: ' + filters + '\n' +
'https://github.com/thlorenz/readdirp#filters'
);
}
}
}
// Turn all filters into a function
if (isFunction(filter)) {
return filter;
} else if (isString(filter)) {
return function (entryInfo) {
return minimatch(entryInfo.name, filter.trim());
};
} else if (filter && Array.isArray(filter)) {
if (filter) filter = filter.map(function (f) {
return f.trim();
});
return isNegated(filter) ?
// use AND to concat multiple negated filters
function (entryInfo) {
return filter.every(function (f) {
return minimatch(entryInfo.name, f);
});
}
:
// use OR to concat multiple inclusive filters
function (entryInfo) {
return filter.some(function (f) {
return minimatch(entryInfo.name, f);
});
};
}
}
function processDir(currentDir, entries, callProcessed) {
if (aborted) return;
var total = entries.length
, processed = 0
, entryInfos = []
;
fs.realpath(currentDir, function(err, realCurrentDir) {
if (aborted) return;
if (err) {
handleError(err);
callProcessed(entryInfos);
return;
}
var relDir = path.relative(realRoot, realCurrentDir);
if (entries.length === 0) {
callProcessed([]);
} else {
entries.forEach(function (entry) {
var fullPath = path.join(realCurrentDir, entry)
, relPath = path.join(relDir, entry);
statfn(fullPath, function (err, stat) {
if (err) {
handleError(err);
} else {
entryInfos.push({
name : entry
, path : relPath // relative to root
, fullPath : fullPath
, parentDir : relDir // relative to root
, fullParentDir : realCurrentDir
, stat : stat
});
}
processed++;
if (processed === total) callProcessed(entryInfos);
});
});
}
});
}
function readdirRec(currentDir, depth, callCurrentDirProcessed) {
var args = arguments;
if (aborted) return;
if (paused) {
si(function () {
readdirRec.apply(null, args);
})
return;
}
fs.readdir(currentDir, function (err, entries) {
if (err) {
handleError(err);
callCurrentDirProcessed();
return;
}
processDir(currentDir, entries, function(entryInfos) {
var subdirs = entryInfos
.filter(function (ei) { return ei.stat.isDirectory() && opts.directoryFilter(ei); });
subdirs.forEach(function (di) {
if(opts.entryType === 'directories' || opts.entryType === 'both' || opts.entryType === 'all') {
fileProcessed(di);
}
readdirResult.directories.push(di);
});
entryInfos
.filter(function(ei) {
var isCorrectType = opts.entryType === 'all' ?
!ei.stat.isDirectory() : ei.stat.isFile() || ei.stat.isSymbolicLink();
return isCorrectType && opts.fileFilter(ei);
})
.forEach(function (fi) {
if(opts.entryType === 'files' || opts.entryType === 'both' || opts.entryType === 'all') {
fileProcessed(fi);
}
readdirResult.files.push(fi);
});
var pendingSubdirs = subdirs.length;
// Be done if no more subfolders exist or we reached the maximum desired depth
if(pendingSubdirs === 0 || depth === opts.depth) {
callCurrentDirProcessed();
} else {
// recurse into subdirs, keeping track of which ones are done
// and call back once all are processed
subdirs.forEach(function (subdir) {
readdirRec(subdir.fullPath, depth + 1, function () {
pendingSubdirs = pendingSubdirs - 1;
if(pendingSubdirs === 0) {
callCurrentDirProcessed();
}
});
});
}
});
});
}
// Validate and normalize filters
try {
opts.fileFilter = normalizeFilter(opts.fileFilter);
opts.directoryFilter = normalizeFilter(opts.directoryFilter);
} catch (err) {
// if we detect illegal filters, bail out immediately
handleFatalError(err);
return stream;
}
// If filters were valid get on with the show
fs.realpath(opts.root, function(err, res) {
if (err) {
handleFatalError(err);
return stream;
}
realRoot = res;
readdirRec(opts.root, 0, function () {
// All errors are collected into the errors array
if (errors.length > 0) {
allProcessed(errors, readdirResult);
} else {
allProcessed(null, readdirResult);
}
});
});
return stream;
}
module.exports = readdir;

99
node_modules/readdirp/stream-api.js generated vendored Normal file

@@ -0,0 +1,99 @@
'use strict';
var si = require('set-immediate-shim');
var stream = require('readable-stream');
var util = require('util');
var Readable = stream.Readable;
module.exports = ReaddirpReadable;
util.inherits(ReaddirpReadable, Readable);
function ReaddirpReadable (opts) {
if (!(this instanceof ReaddirpReadable)) return new ReaddirpReadable(opts);
opts = opts || {};
opts.objectMode = true;
Readable.call(this, opts);
// backpressure not implemented at this point
this.highWaterMark = Infinity;
this._destroyed = false;
this._paused = false;
this._warnings = [];
this._errors = [];
this._pauseResumeErrors();
}
var proto = ReaddirpReadable.prototype;
proto._pauseResumeErrors = function () {
var self = this;
self.on('pause', function () { self._paused = true });
self.on('resume', function () {
if (self._destroyed) return;
self._paused = false;
self._warnings.forEach(function (err) { self.emit('warn', err) });
self._warnings.length = 0;
self._errors.forEach(function (err) { self.emit('error', err) });
self._errors.length = 0;
})
}
// called for each entry
proto._processEntry = function (entry) {
if (this._destroyed) return;
this.push(entry);
}
proto._read = function () { }
proto.destroy = function () {
// when stream is destroyed it will emit nothing further, not even errors or warnings
this.push(null);
this.readable = false;
this._destroyed = true;
this.emit('close');
}
proto._done = function () {
this.push(null);
}
// we emit errors and warnings async since we may handle errors like invalid args
// within the initial event loop before any event listeners subscribed
proto._handleError = function (err) {
var self = this;
si(function () {
if (self._paused) return self._warnings.push(err);
if (!self._destroyed) self.emit('warn', err);
});
}
proto._handleFatalError = function (err) {
var self = this;
si(function () {
if (self._paused) return self._errors.push(err);
if (!self._destroyed) self.emit('error', err);
});
}
function createStreamAPI () {
var stream = new ReaddirpReadable();
return {
stream : stream
, processEntry : stream._processEntry.bind(stream)
, done : stream._done.bind(stream)
, handleError : stream._handleError.bind(stream)
, handleFatalError : stream._handleFatalError.bind(stream)
};
}
module.exports = createStreamAPI;

0
node_modules/readdirp/test/bed/root_file1.ext1 generated vendored Normal file

0
node_modules/readdirp/test/bed/root_file2.ext2 generated vendored Normal file

0
node_modules/readdirp/test/bed/root_file3.ext3 generated vendored Normal file

338
node_modules/readdirp/test/readdirp-stream.js generated vendored Normal file

@@ -0,0 +1,338 @@
/*jshint asi:true */
var debug //= true;
var test = debug ? function () {} : require('tap').test
var test_ = !debug ? function () {} : require('tap').test
, path = require('path')
, fs = require('fs')
, util = require('util')
, TransformStream = require('readable-stream').Transform
, through = require('through2')
, proxyquire = require('proxyquire')
, streamapi = require('../stream-api')
, readdirp = require('..')
, root = path.join(__dirname, 'bed')
, totalDirs = 6
, totalFiles = 12
, ext1Files = 4
, ext2Files = 3
, ext3Files = 2
;
// see test/readdirp.js for test bed layout
function opts (extend) {
var o = { root: root };
if (extend) {
for (var prop in extend) {
o[prop] = extend[prop];
}
}
return o;
}
function capture () {
var result = { entries: [], errors: [], ended: false }
, dst = new TransformStream({ objectMode: true });
dst._transform = function (entry, _, cb) {
result.entries.push(entry);
cb();
}
dst._flush = function (cb) {
result.ended = true;
this.push(result);
cb();
}
return dst;
}
test('\nintegrated', function (t) {
t.test('\n# reading root without filter', function (t) {
t.plan(2);
readdirp(opts())
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, totalFiles, 'emits all files');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
));
})
t.test('\n# normal: ["*.ext1", "*.ext3"]', function (t) {
t.plan(2);
readdirp(opts( { fileFilter: [ '*.ext1', '*.ext3' ] } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, ext1Files + ext3Files, 'all ext1 and ext3 files');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# files only', function (t) {
t.plan(2);
readdirp(opts( { entryType: 'files' } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, totalFiles, 'returned files');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# directories only', function (t) {
t.plan(2);
readdirp(opts( { entryType: 'directories' } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, totalDirs, 'returned directories');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# both directories + files', function (t) {
t.plan(2);
readdirp(opts( { entryType: 'both' } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, totalDirs + totalFiles, 'returned everything');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# directory filter with directories only', function (t) {
t.plan(2);
readdirp(opts( { entryType: 'directories', directoryFilter: [ 'root_dir1', '*dir1_subdir1' ] } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, 2, 'two directories');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# directory and file filters with both entries', function (t) {
t.plan(2);
readdirp(opts( { entryType: 'both', directoryFilter: [ 'root_dir1', '*dir1_subdir1' ], fileFilter: [ '!*.ext1' ] } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, 6, '2 directories and 4 files');
t.ok(result.ended, 'ends stream');
t.end();
cb();
}
))
})
t.test('\n# negated: ["!*.ext1", "!*.ext3"]', function (t) {
t.plan(2);
readdirp(opts( { fileFilter: [ '!*.ext1', '!*.ext3' ] } ))
.on('error', function (err) {
t.fail('should not throw error', err);
})
.pipe(capture())
.pipe(through.obj(
function (result, _ , cb) {
t.equals(result.entries.length, totalFiles - ext1Files - ext3Files, 'all but ext1 and ext3 files');
t.ok(result.ended, 'ends stream');
t.end();
}
))
})
t.test('\n# no options given', function (t) {
t.plan(1);
readdirp()
.on('error', function (err) {
t.similar(err.toString() , /Need to pass at least one argument/ , 'emits meaningful error');
t.end();
})
})
t.test('\n# mixed: ["*.ext1", "!*.ext3"]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ '*.ext1', '!*.ext3' ] } ))
.on('error', function (err) {
t.similar(err.toString() , /Cannot mix negated with non negated glob filters/ , 'emits meaningful error');
t.end();
})
})
})
test('\napi separately', function (t) {
t.test('\n# handleError', function (t) {
t.plan(1);
var api = streamapi()
, warning = new Error('some file caused problems');
api.stream
.on('warn', function (err) {
t.equals(err, warning, 'warns with the handled error');
})
api.handleError(warning);
})
t.test('\n# when stream is paused and then resumed', function (t) {
t.plan(6);
var api = streamapi()
, resumed = false
, fatalError = new Error('fatal!')
, nonfatalError = new Error('nonfatal!')
, processedData = 'some data'
;
api.stream
.on('warn', function (err) {
t.equals(err, nonfatalError, 'emits the buffered warning');
t.ok(resumed, 'emits warning only after it was resumed');
})
.on('error', function (err) {
t.equals(err, fatalError, 'emits the buffered fatal error');
t.ok(resumed, 'emits errors only after it was resumed');
})
.on('data', function (data) {
t.equals(data, processedData, 'emits the buffered data');
t.ok(resumed, 'emits data only after it was resumed');
})
.pause()
api.processEntry(processedData);
api.handleError(nonfatalError);
api.handleFatalError(fatalError);
setTimeout(function () {
resumed = true;
api.stream.resume();
}, 1)
})
t.test('\n# when a stream is paused it stops walking the fs', function (t) {
var resumed = false,
mockedAPI = streamapi();
mockedAPI.processEntry = function (entry) {
if (!resumed) t.notOk(true, 'should not emit while paused')
t.ok(entry, 'emitted while resumed')
}.bind(mockedAPI.stream)
function wrapper () {
return mockedAPI
}
var readdirp = proxyquire('../readdirp', {'./stream-api': wrapper})
, stream = readdirp(opts())
.on('error', function (err) {
t.fail('should not throw error', err);
})
.on('end', function () {
t.end()
})
.pause();
setTimeout(function () {
resumed = true;
stream.resume();
}, 5)
})
t.test('\n# when a stream is destroyed, it emits "closed", but no longer emits "data", "warn" and "error"', function (t) {
var api = streamapi()
, fatalError = new Error('fatal!')
, nonfatalError = new Error('nonfatal!')
, processedData = 'some data'
, plan = 0;
t.plan(6)
var stream = api.stream
.on('warn', function (err) {
t.ok(!stream._destroyed, 'emits warning until destroyed');
})
.on('error', function (err) {
t.ok(!stream._destroyed, 'emits errors until destroyed');
})
.on('data', function (data) {
t.ok(!stream._destroyed, 'emits data until destroyed');
})
.on('close', function () {
t.ok(stream._destroyed, 'emits close when stream is destroyed');
})
api.processEntry(processedData);
api.handleError(nonfatalError);
api.handleFatalError(fatalError);
setTimeout(function () {
stream.destroy()
t.notOk(stream.readable, 'stream is no longer readable after it is destroyed')
api.processEntry(processedData);
api.handleError(nonfatalError);
api.handleFatalError(fatalError);
process.nextTick(function () {
t.pass('emits no more data, warn or error events after it was destroyed')
t.end();
})
}, 10)
})
})

289
node_modules/readdirp/test/readdirp.js generated vendored Normal file

@@ -0,0 +1,289 @@
/*jshint asi:true */
var test = require('tap').test
, path = require('path')
, fs = require('fs')
, util = require('util')
, net = require('net')
, readdirp = require('../readdirp.js')
, root = path.join(__dirname, '../test/bed')
, totalDirs = 6
, totalFiles = 12
, ext1Files = 4
, ext2Files = 3
, ext3Files = 2
, rootDir2Files = 2
, nameHasLength9Dirs = 2
, depth1Files = 8
, depth0Files = 3
;
/*
Structure of test bed:
.
├── root_dir1
│   ├── root_dir1_file1.ext1
│   ├── root_dir1_file2.ext2
│   ├── root_dir1_file3.ext3
│   ├── root_dir1_subdir1
│   │   └── root1_dir1_subdir1_file1.ext1
│   └── root_dir1_subdir2
│   └── .gitignore
├── root_dir2
│   ├── root_dir2_file1.ext1
│   ├── root_dir2_file2.ext2
│   ├── root_dir2_subdir1
│   │   └── .gitignore
│   └── root_dir2_subdir2
│   └── .gitignore
├── root_file1.ext1
├── root_file2.ext2
└── root_file3.ext3
6 directories, 13 files
*/
// console.log('\033[2J'); // clear console
function opts (extend) {
var o = { root: root };
if (extend) {
for (var prop in extend) {
o[prop] = extend[prop];
}
}
return o;
}
test('\nreading root without filter', function (t) {
t.plan(2);
readdirp(opts(), function (err, res) {
t.equals(res.directories.length, totalDirs, 'all directories');
t.equals(res.files.length, totalFiles, 'all files');
t.end();
})
})
test('\nreading root without filter using lstat', function (t) {
t.plan(2);
readdirp(opts({ lstat: true }), function (err, res) {
t.equals(res.directories.length, totalDirs, 'all directories');
t.equals(res.files.length, totalFiles, 'all files');
t.end();
})
})
test('\nreading root with symlinks using lstat', function (t) {
t.plan(2);
fs.symlinkSync(path.join(root, 'root_dir1'), path.join(root, 'dirlink'));
fs.symlinkSync(path.join(root, 'root_file1.ext1'), path.join(root, 'link.ext1'));
readdirp(opts({ lstat: true }), function (err, res) {
t.equals(res.directories.length, totalDirs, 'all directories');
t.equals(res.files.length, totalFiles + 2, 'all files + symlinks');
fs.unlinkSync(path.join(root, 'dirlink'));
fs.unlinkSync(path.join(root, 'link.ext1'));
t.end();
})
})
test('\nreading non-standard fds', function (t) {
t.plan(2);
var server = net.createServer().listen(path.join(root, 'test.sock'), function(){
readdirp(opts({ entryType: 'all' }), function (err, res) {
t.equals(res.files.length, totalFiles + 1, 'all files + socket');
readdirp(opts({ entryType: 'both' }), function (err, res) {
t.equals(res.files.length, totalFiles, 'all regular files only');
server.close();
t.end();
})
})
});
})
test('\nreading root using glob filter', function (t) {
// normal
t.test('\n# "*.ext1"', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: '*.ext1' } ), function (err, res) {
t.equals(res.files.length, ext1Files, 'all ext1 files');
t.end();
})
})
t.test('\n# ["*.ext1", "*.ext3"]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ '*.ext1', '*.ext3' ] } ), function (err, res) {
t.equals(res.files.length, ext1Files + ext3Files, 'all ext1 and ext3 files');
t.end();
})
})
t.test('\n# "root_dir1"', function (t) {
t.plan(1);
readdirp(opts( { directoryFilter: 'root_dir1' }), function (err, res) {
t.equals(res.directories.length, 1, 'one directory');
t.end();
})
})
t.test('\n# ["root_dir1", "*dir1_subdir1"]', function (t) {
t.plan(1);
readdirp(opts( { directoryFilter: [ 'root_dir1', '*dir1_subdir1' ]}), function (err, res) {
t.equals(res.directories.length, 2, 'two directories');
t.end();
})
})
t.test('\n# negated: "!*.ext1"', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: '!*.ext1' } ), function (err, res) {
t.equals(res.files.length, totalFiles - ext1Files, 'all but ext1 files');
t.end();
})
})
t.test('\n# negated: ["!*.ext1", "!*.ext3"]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ '!*.ext1', '!*.ext3' ] } ), function (err, res) {
t.equals(res.files.length, totalFiles - ext1Files - ext3Files, 'all but ext1 and ext3 files');
t.end();
})
})
t.test('\n# mixed: ["*.ext1", "!*.ext3"]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ '*.ext1', '!*.ext3' ] } ), function (err, res) {
t.similar(err[0].toString(), /Cannot mix negated with non negated glob filters/, 'returns meaningfull error');
t.end();
})
})
t.test('\n# leading and trailing spaces: [" *.ext1", "*.ext3 "]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ ' *.ext1', '*.ext3 ' ] } ), function (err, res) {
t.equals(res.files.length, ext1Files + ext3Files, 'all ext1 and ext3 files');
t.end();
})
})
t.test('\n# leading and trailing spaces: [" !*.ext1", " !*.ext3 "]', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: [ ' !*.ext1', ' !*.ext3' ] } ), function (err, res) {
t.equals(res.files.length, totalFiles - ext1Files - ext3Files, 'all but ext1 and ext3 files');
t.end();
})
})
t.test('\n# ** glob pattern', function (t) {
t.plan(1);
readdirp(opts( { fileFilter: '**/*.ext1' } ), function (err, res) {
t.equals(res.files.length, ext1Files, 'ignores ** in **/*.ext1 -> only *.ext1 files');
t.end();
})
})
})
test('\n\nreading root using function filter', function (t) {
t.test('\n# file filter -> "contains root_dir2"', function (t) {
t.plan(1);
readdirp(
opts( { fileFilter: function (fi) { return fi.name.indexOf('root_dir2') >= 0; } })
, function (err, res) {
t.equals(res.files.length, rootDir2Files, 'all rootDir2Files');
t.end();
}
)
})
t.test('\n# directory filter -> "name has length 9"', function (t) {
t.plan(1);
readdirp(
opts( { directoryFilter: function (di) { return di.name.length === 9; } })
, function (err, res) {
t.equals(res.directories.length, nameHasLength9Dirs, 'all all dirs with name length 9');
t.end();
}
)
})
})
test('\nreading root specifying maximum depth', function (t) {
t.test('\n# depth 1', function (t) {
t.plan(1);
readdirp(opts( { depth: 1 } ), function (err, res) {
t.equals(res.files.length, depth1Files, 'does not return files at depth 2');
})
})
})
test('\nreading root with no recursion', function (t) {
t.test('\n# depth 0', function (t) {
t.plan(1);
readdirp(opts( { depth: 0 } ), function (err, res) {
t.equals(res.files.length, depth0Files, 'does not return files at depth 0');
})
})
})
test('\nprogress callbacks', function (t) {
t.plan(2);
var pluckName = function(fi) { return fi.name; }
, processedFiles = [];
readdirp(
opts()
, function(fi) {
processedFiles.push(fi);
}
, function (err, res) {
t.equals(processedFiles.length, res.files.length, 'calls back for each file processed');
t.deepEquals(processedFiles.map(pluckName).sort(),res.files.map(pluckName).sort(), 'same file names');
t.end();
}
)
})
test('resolving of name, full and relative paths', function (t) {
var expected = {
name : 'root_dir1_file1.ext1'
, parentDirName : 'root_dir1'
, path : 'root_dir1/root_dir1_file1.ext1'
, fullPath : 'test/bed/root_dir1/root_dir1_file1.ext1'
}
, opts = [
{ root: './bed' , prefix: '' }
, { root: './bed/' , prefix: '' }
, { root: 'bed' , prefix: '' }
, { root: 'bed/' , prefix: '' }
, { root: '../test/bed/' , prefix: '' }
, { root: '.' , prefix: 'bed' }
]
t.plan(opts.length);
opts.forEach(function (op) {
op.fileFilter = 'root_dir1_file1.ext1';
t.test('\n' + util.inspect(op), function (t) {
t.plan(4);
readdirp (op, function(err, res) {
t.equals(res.files[0].name, expected.name, 'correct name');
t.equals(res.files[0].path, path.join(op.prefix, expected.path), 'correct path');
})
fs.realpath(op.root, function(err, fullRoot) {
readdirp (op, function(err, res) {
t.equals(
res.files[0].fullParentDir
, path.join(fullRoot, op.prefix, expected.parentDirName)
, 'correct parentDir'
);
t.equals(
res.files[0].fullPath
, path.join(fullRoot, op.prefix, expected.parentDirName, expected.name)
, 'correct fullPath'
);
})
})
})
})
})