
Reusable date range validator

Note: This post describes my developer experience – the steps I had to take to make it work. I am not describing only the final result but also the problems I experienced and how I dealt with them. In the development process, the issues snowball and spiral out of control. Also, this is a follow-up to the previous post.

The validator checks the catalogue's validity date range. The catalogue object has two properties – validFrom and validTo – that make it valid only within this date range. Neither property is required, so the catalogue can be active from a specific day forward, until some date, or forever. But the date range has to be a valid one: the end of validity has to be after the start of validity. The date range is also validated only when the catalogue is active. That information is stored in the isActive property, initially named isEnabled but renamed later.

The validator is implemented on the server-side as a custom validator and attached to the properties of the input type with the @Validate(DateFromToValidator) decorator.

@InputType()
export class CatalogInput {
  ...
  @Validate(DateFromToValidator)
  @Field({ nullable: true })
  validFrom?: Date;
  @Validate(DateFromToValidator)
  @Field({ nullable: true })
  validTo?: Date;
  @Field()
  isActive: boolean;
}

The code of the validator from the previous post:

@ValidatorConstraint({ name: "dateFromToValidator", async: false })
export class DateFromToValidator implements ValidatorConstraintInterface {
  validate(value: string, args: ValidationArguments) {
    const catalog = args.object as CatalogInput;
    if (!catalog.isEnabled) {
      return true;
    }

    return catalog.validFrom <= catalog.validTo;
  }

  defaultMessage(args: ValidationArguments) {
    return "Text is too short or too long!";
  }
}

You may have noticed that the DateFromToValidator contains bugs.

  1. It does not work when only one of the range values is provided.
  2. The error message is nonsensical.
  3. It has to be split into two constraints to show a proper validation message for each property.
  4. The validator is not reusable.

This happened during development while I was dealing with the other snowballed issues, and I completely forgot about it. Splitting the constraint naively just means duplicating the same code under a new name (DateFromValidator, and presumably a matching DateToValidator), which makes the reusability problem even more apparent:

@ValidatorConstraint({ name: "dateFromValidator", async: false })
export class DateFromValidator implements ValidatorConstraintInterface {
  validate(value: string, args: ValidationArguments) {
    const catalog = args.object as CatalogInput;
    if (!catalog.isEnabled) {
      return true;
    }

    return catalog.validFrom <= catalog.validTo;
  }

  defaultMessage(args: ValidationArguments) {
    return "Text is too short or too long!";
  }
}

Reusability

The steps to make the validator reusable were introducing a new interface, IHasValidityRestriction, and getting rid of the dependency on the value of the isEnabled property.

export interface IHasValidityRestriction {
  validFrom?: Date;
  validTo?: Date;
}

Then any class implementing this interface can be validated:

@ValidatorConstraint({ name: "dateRangeValidator", async: false })
export class DateRangeValidator implements ValidatorConstraintInterface {
  constructor(private message: string) {}

  validate(value: any, args: ValidationArguments) {
    const o = args.object as IHasValidityRestriction;
    // validate only when both validFrom and validTo values have been supplied
    if (o.validFrom && o.validTo) {
      return o.validFrom <= o.validTo;
    }

    return true;
  }

  defaultMessage(args: ValidationArguments) { return this.message; }
}

The class-validator library has a neat conditional decorator, ValidateIf, that ignores the validators on a property when the provided condition function returns false. In this case, the date range is validated only when isEnabled is true.

Decorators

I have also created two decorators to validate the validFrom and validTo properties; each of them has a different constraint violation message.

export function DateRangeStart(validationOptions?: ValidationOptions) {
  return function(object: Object, propertyName: string) {
    registerDecorator({
      name: "dateRangeStartValidator",
      target: object.constructor,
      propertyName: propertyName,
      constraints: [],
      options: validationOptions,
      validator: new DateRangeValidator(
        `${propertyName} must be before date range end`
      )
    });
  };
}

export function DateRangeEnd(validationOptions?: ValidationOptions) {
  return function(object: Object, propertyName: string) {
    registerDecorator({
      name: "dateRangeEndValidator",
      target: object.constructor,
      propertyName: propertyName,
      constraints: [],
      options: validationOptions,
      validator: new DateRangeValidator(
        `${propertyName} must be after date range start`
      )
    });
  };
}

Result

And this is the result when everything is put together:

@InputType()
export class CatalogInput implements IHasValidityRestriction {
  @ValidateIf(o => o.isEnabled)
  @DateRangeStart()
  @Field({ nullable: true })
  validFrom?: Date;
  
  @ValidateIf(o => o.isEnabled)
  @DateRangeEnd()
  @Field({ nullable: true })
  validTo?: Date;
  
  @Field()
  isEnabled: boolean;
}

Notes

I was trying to improve the UX by disabling the form submit button when a field has an invalid value, but this does not work in this case: while I can change the dates so the range becomes valid, the fields remain marked invalid until the next server-side validation.

<Button
  type="primary"
  disabled={hasError}
  htmlType="submit">
  Submit
</Button>

And the hasError value is derived from the form itself.

const fieldErrors = form.getFieldsError();
const hasError = Object.keys(fieldErrors).some(p => fieldErrors[p] !== undefined);

The (again) ugly fix was to explicitly reset the errors for the date range fields on submit.

handleSubmit = async (catalog: ICatalog, form: WrappedFormUtils<any>) => {
  ...
  form.setFields({
    validFrom: { value: form.getFieldValue("validFrom") },
    validTo: { value: form.getFieldValue("validTo") }
  });
  ...
}

Is my code going to be full of ugly hacks? I certainly hope not. Some might still argue, but a much better fix is to reset the field errors when the date range fields change. The handler has to reset the errors for both input fields because the validator invalidates both, so the onChange handler has been added to both input fields.

<DatePicker
  onChange={() =>
    form.setFields({
      validTo: { value: form.getFieldValue("validTo") },
      validFrom: { value: form.getFieldValue("validFrom") }
    })
  } />

Final thoughts

Developers who know how similar controls work would certainly avoid half of the discussed problems and use the DatePicker disabledDate property to limit the start and end dates. Using it improves the UX on the client-side, but data provided by the user still has to be validated on the server.

{getFieldDecorator("validTo", {
  initialValue: momentFunc(catalog.validTo) 
})(<DatePicker
  onChange={() =>
    form.setFields({
      validTo: { value: form.getFieldValue("validTo") },
      validFrom: { value: form.getFieldValue("validFrom") }
    })  
  } 
  disabledDate={this.disabledValidTo}
/>)}
disabledValidTo = (validTo?: moment.Moment): boolean => {
  const validFrom = this.props.form.getFieldValue("validFrom");
  if (validTo && validFrom) {
    return validTo.valueOf() <= validFrom.valueOf();
  }
  return false;
};

Communicating server-side input validation failures with GraphQL and ant-design form

Note: This post describes my developer experience – the steps I had to take to make it work. I am not describing only the final result but also the problems I experienced and how I dealt with them. In the development process, the issues snowball and spiral out of control.

When I chose a completely different programming language (TypeScript), server-side API approach (GraphQL) and React UI library (ant-design) for the development of my application, I did not know how much it would slow me down. Every new feature I wanted to implement (including the most basic ones) meant spending some time researching on Google, StackOverflow and GitHub. This time it was no different – server-side validation and communicating input validation failures to the user.

The form

The catalog object has two properties – validFrom and validTo – that make it valid only within this date range. Neither property is required, so the catalog can be active from a specific day forward, until some date, or forever. But the date range has to be a valid one: the end of validity has to be after the start of validity.

The validator is implemented on the server-side as a custom validator and attached to the properties of the input type with the @Validate(DateFromToValidator) decorator.

@InputType()
export class CatalogInput {
  @Field({ nullable: true })
  id?: number;
  @Field()
  name: string;
  @MaxLength(255)
  @Field({ nullable: true })
  description?: string;
  @Validate(DateFromToValidator)
  @Field({ nullable: true })
  validFrom?: Date;
  @Validate(DateFromToValidator)
  @Field({ nullable: true })
  validTo?: Date;
  @Field()
  isPublic: boolean;
  @Field()
  isEnabled: boolean;
}

The library uses class-validator, so I can create a custom validation constraint by implementing the ValidatorConstraintInterface interface. The interface has two methods:

  • validate that should return true when everything is OK, and
  • defaultMessage to return the error message.

The code of the (not reusable) validation constraint that validates the validFrom/validTo date range is as follows:

@ValidatorConstraint({ name: "dateFromToValidator", async: false })
export class DateFromToValidator implements ValidatorConstraintInterface {
  validate(value: string, args: ValidationArguments) {
    const catalog = args.object as CatalogInput;
    if (!catalog.isEnabled) {
      return true;
    }

    return catalog.validFrom <= catalog.validTo;
  }

  defaultMessage(args: ValidationArguments) {
    return "Text is too short or too long!";
  }
}

You can argue that I could have implemented the validation on the client. Still, the golden rule of API design is to never trust the values provided by the user and to always validate the data on the server-side. While client-side validation may improve the UX, it also means having two implementations of the validator.

You may also say that is not a problem since the same code can run both in the browser and on the server-side, and you may be right. But for now, let’s leave it like it is 🙂 I will change it in the future.
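
For illustration, the shared piece could be a plain function that both the server-side constraint and a client-side form validator call – a minimal sketch of the idea, not something the code in this post does yet:

// A framework-agnostic sketch of the shared rule; both sides could import it.
export const isValidDateRange = (validFrom?: Date, validTo?: Date): boolean => {
  // an open-ended (or empty) range is always valid
  if (!validFrom || !validTo) {
    return true;
  }
  return validFrom <= validTo;
};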

When the validation fails on the server-side, the GraphQL API returns a response with an errors array indicating a problem. Unlike REST, the response has a 200 HTTP status code. The error response is well structured and contains some additional information like the stack trace. The Apollo server introduced standardized errors where additional details are provided in the extensions map.

{
  "errors": [
    {
      "message": "Argument Validation Error!",
      "locations": [
        {
          "line": 2,
          "column": 3
        }
      ],
      "path": [
        "updateCatalog"
      ],
      "extensions": {
        "code": "BAD_USER_INPUT",
        "exception": {
          "validationErrors": [
            {
              "target": {
                "id": 2,
                "name": "Name",
                "description": "Description",
                "validFrom": "2019-12-19T11:42:31.972Z",
                "validTo": null,
                "isPublic": false,
                "isEnabled": true
              },
              "value": "2019-12-19T11:42:31.972Z",
              "property": "dateFromToValidator",
              "children": [],
              "constraints": {
                "catalogValidity": "Text is too short or too long!"
              }
            }
          ],
          "stacktrace": [
            "UserInputError: Argument Validation Error!",
            "    at Object.exports.translateError (C:\\Work\\playground\\app\\server\\src\\modules\\shared\\ErrorInterceptor.ts:69:11)",
            "    at C:\\Work\\playground\\app\\server\\src\\modules\\shared\\ErrorInterceptor.ts:29:13",
            "    at Generator.throw (<anonymous>)",
            "    at rejected (C:\\Work\\playground\\app\\server\\src\\modules\\shared\\ErrorInterceptor.ts:6:65)",
            "    at process._tickCallback (internal/process/next_tick.js:68:7)"
          ]
        }
      }
    }
  ],
  "data": null
}

To display the validation errors in the ant-design form, I have to convert the GraphQL error response to an object that can be passed to the setFields method of the form. The signature of the method is setFields(obj: Object): void; which is not very helpful. A search on the ant-design GitHub pages showed that the object passed in must have the same properties as the edited object. Each member is another object with a property value holding the edited object's value and an optional property errors containing the error(s) to be displayed.

const formFields: { [property: string]: { value: any, errors?: Error[] } } = {};

A failed attempt to mutate the data throws an exception – the response is rejected in Apollo. The error handler is a switch with cases for the possible error codes. Only user input errors are handled here (the BAD_USER_INPUT extension code).

try {
  await this.props.client.mutate<CatalogData>({
    mutation: UPDATE_CATALOG,
    variables: {
      catalog
    }
  });
} catch (e) {
  const apolloError = e as ApolloError;
  if (apolloError) {
    apolloError.graphQLErrors.forEach(graphQLError => {
      const code = graphQLError.extensions.code;
      switch (code) {
        case "BAD_USER_INPUT":
          const validationErrors = graphQLError.extensions.exception.validationErrors;
          const formFields: { [property: string]: any } = {};
          validationErrors.forEach(
            (validationError: ApolloValidationError) => {
              const {
                target,
                property,
                constraints,
                value
              } = validationError;

              const errors = [];
              for (const key in constraints) {
                const value = constraints[key];
                errors.push(new Error(value));
              }

              formFields[property] = {
                value,
                errors
              };
            }
          );
          setTimeout(() => {
            form.setFields(formFields);
          }, 500);
          break;
        default:
          this.handleError(e);
          break;
      }
    });
  }
}

And an object like this will be passed into the setFields method:

{
  "validFrom": {
    "value": "2019-12-18T12:31:42.487Z",
    "errors": [
      Error("Text is too short or too long!")
    ]
  }
}

This code does not work – the DatePicker control expects a value of the moment type, and it gets a string instead. This attempt ends with warnings being written to the console and an exception being thrown:

Warning: Failed prop type: Invalid prop `value` of type `string` supplied to `CalendarMixinWrapper`, expected `object`.
Warning: Failed prop type: Invalid prop `value` supplied to `Picker`.
TypeError: value.format is not a function

When these input fields are rendered, the provided value is explicitly converted from a Date to a moment instance:

const momentFunc = (value?: Date) => moment(value).isValid() ? moment(value) : null;

<Form.Item label="Valid to">
  <Row>
    <Col>
      {getFieldDecorator("validTo", { initialValue: momentFunc(catalog.validTo) })
        (<DatePicker locale={locale} />)
      }
    </Col>
  </Row>
</Form.Item>

The form is re-rendered, but the initialValue is not recalculated.

The quick and ugly hack is to convert the string values representing dates into moment instances. It is ugly because I have to list the names of the properties holding date values, so I can't use this as a general solution:

const momentFunc = (value?: string) => moment(value).isValid() ? moment(value) : null;

let v = value;
// "property" holds the name of the field that failed validation
if (property === "validTo" || property === "validFrom") {
  v = momentFunc(value);
}

const errors = [];
for (const key in constraints) {
  const value = constraints[key];
  errors.push(new Error(value));
}

formFields[property] = { value: v, errors };

This works now, with a minor problem – the validFrom input field is cleared on form submit and comes back with the validation failure message. Oops, I had been accidentally calling the form.resetFields method in the submit handler.

Types exist only at compile time

Even though TypeScript brings optional static type-checking, it is only performed at compile time. The type information (the metadata) does not become part of the compiled JavaScript code (unlike in C# or Java). However, code elements can be extended with decorators to provide type information at runtime.
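
A minimal sketch of what that means in practice, assuming reflect-metadata is imported and emitDecoratorMetadata is enabled in tsconfig.json (TrackType is a hypothetical marker decorator – its presence alone makes the compiler emit the metadata):

import "reflect-metadata";

// Hypothetical no-op property decorator; decorating a property triggers the
// compiler to store its declared type under the "design:type" metadata key.
function TrackType(): PropertyDecorator {
  return () => { /* no-op */ };
}

class Example {
  @TrackType() createdAt: Date;
}

// The declared type survives into runtime and can be queried:
const propertyType = Reflect.getMetadata("design:type", Example.prototype, "createdAt");
console.log(propertyType === Date); // true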

Good news! I already have decorators in place – I have decorated the properties of the CatalogInput class. This type is used for updating (mutating) the catalog instance through the GraphQL API. Bad news – this class is located in the server project, which sits parallel to the client project 🙁 I will write a separate post on this once I figure out how to achieve code sharing between client and server.

For now, the ugly hack is to try to convert the given value to a moment instance:

const isValidDate = (dateObject: string) => new Date(dateObject).toString() !== "Invalid Date";

let v = value;
if (isValidDate(v)) {
  v = momentFunc(value);
}

const errors = [];
for (const key in constraints) {
  const value = constraints[key];
  errors.push(new Error(value));
}

formFields[property] = { value: v, errors };

A much better and more robust solution would be to use the type information from the decorators on the client-side, or to extend the GraphQL API response with type information (again, from the member decorators on the server-side).
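
A rough sketch of the first option, assuming the decorated CatalogInput class could be shared with the client and its "design:type" metadata queried as in the sketch above (the toFormValue helper is purely illustrative):

// Use the "design:type" metadata emitted for the decorated property to decide
// whether the raw value needs to be converted to a moment instance.
const toFormValue = (target: object, property: string, value: any) => {
  const designType = Reflect.getMetadata("design:type", target, property);
  return designType === Date ? momentFunc(value) : value;
};

formFields[property] = {
  value: toFormValue(CatalogInput.prototype, property, value),
  errors
};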

TypeScript, GraphQL, Sequelize … OMG

For my new project I decided to use TypeScript instead. I got fed up with all those unexpected JavaScript run-time errors occurring in my previous projects. It was not a very pleasant journey – so many new things, a steep learning curve, new syntax – but you get the type safety, which is nice.

All my previous projects were using a REST API. I decided to give GraphQL a try. It was not a very pleasant journey either – so many new things, a steep learning curve, new syntax, Apollo, variables, no HTTP status codes to detect errors, etc. – but you get the playground, which is nice.

Right now I doubt the benefits of using it – or at least that was true when I started writing this post.

The basic idea is not to reinvent the wheel, so I chose the graphql-modules NPM package for “easier” adoption. The library encourages you to create reusable modules encapsulating functionality, has dependency injection support, and the modules should be testable (I don’t know, I haven’t got there yet).

The most important part is the resolver, a function that resolves the value of a field of a type in the schema. Resolution starts at the root object returned by the query and resolves all the requested fields. This is one of the features of GraphQL – you get what you ask for, no less, no more. If you do not ask for a field, it will not appear in the result (compare that with REST). The fancy terms are under-fetching and over-fetching.
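
For example, a query like this one (written against the types shown below) resolves only the listed fields; the roles field and its potentially expensive resolver are never touched:

import gql from "graphql-tag";

// Only the requested fields are resolved and returned.
const GET_USERS = gql`
  query {
    users(page: 1, pageSize: 20) {
      totalCount
      records {
        firstName
        emailAddress
      }
    }
  }
`;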

But back to the code: I have two entities – User with many Roles. In the User model, the association is defined as

@BelongsToMany(() => Role, () => UserRole) roles: Role[];

The schema is auto-generated from these types:

@ObjectType()          
export class User {
  @Field(type => ID) id: number; 
  @Field() firstName: string;
  @Field() lastName: string;
  @Field() emailAddress: string;
  @Field() suspended: boolean;
  @Field() confirmed: boolean;
  @Field() deleted: boolean;
}          

@ObjectType()
export class Role {
  @Field() name: string;          
  @Field() description: string;
}          

@ObjectType()
export class QueryResult {
  @Field() page: number;
  @Field() rowsPerPage: number;
  @Field() totalCount: number;
  @Field(type => [User]) records: User[];
}          

@ArgsType()
export class SearchUsersArgs {
  @Field({ nullable: true }) sortBy?: string;          
  @Field(type => Boolean, { nullable: true }) sortDirection?: Boolean;
 
  @Field(type => Int, { defaultValue: 1, nullable: true })
  @Min(0) page: number = 1;          

  @Field(type => Int, { nullable: true })
  @Min(10)
  @Max(50)
  pageSize = 20;
}

and the resolver has two parametrized queries – one to return a list of users

@Query(returns => QueryResult)
async users(@Args() input: SearchUsersArgs) {
  return await this.UsersProvider.getUsers(input);
}

and another to find one user by id

@Query(returns => User)
async user(@Arg("id", type => Int) id: number) {
  const user = await this.UsersProvider.getUserById(id);
  if (user) {
    return user;
  }
  throw new NotFoundError("User with given id was not found.");
}

The user properties like name, emailAddress, etc. are resolved by default just by accessing the object properties. For the roles, the situation is different – the data is stored in a different table in the MySQL database. The first idea was to JOIN the data and include the roles in the query, getting the roles along with the users in a single database roundtrip.

const users = await User.findAll({
  attributes: {
    exclude: ["password", "password_confirmation", "password_digest"]
  },
  include: [{ model: Role }],
  offset: (page - 1) * pageSize,
  limit: pageSize,
  order: [[sortBy, sortDirection]]
});

I quickly dismissed this approach, as I can’t tell in advance whether the user wants to include the roles in the result set, and I may be loading data from the database and not using it later (over-fetching). Even though there is only one association now, this might change in the future, and I would have to include all of them, which would result in a query JOINing many tables just to return the names of the users.

The GraphQL-like solution to this problem is to create a resolver for the roles field:

@FieldResolver(returns => [Role])
async roles(@Root() root: User) { ... }

At first I had a mismatch in what type was returned by the GraphQL query, but because the instances returned by Sequelize do have field accessors, the result was as expected for the basic queries getting the name, emailAddress, etc.

The value of the root parameter with the @Root annotation is provided by the GraphQL engine. I have two types representing the user: the database entity User and the data transfer object UserDTO. The root here was initially the instance of the model class, i.e. the entity loaded from the database; I thought it should be the instance of the DTO class, which fits better in the world of GraphQL. What I am losing is the model capability and the access to its properties, fields and association collections (which I should not have access to anyway, because I don’t want to over-fetch). But it should return a DTO, because I don’t know what the end-user wants, and returning a database entity may leak some information to the end-user.

The correct declaration of the resolver for the roles field should then be

async roles(@Root() root: User): Promise<RoleDTO[]> { ... }

I query the database to find the corresponding user entity for the root passed in by the GraphQL engine:

@FieldResolver(returns => [RoleDTO])
async roles(@Root() root: User): Promise<RoleDTO[]> {
  const user = await this.UsersProvider.getUserById(root.id); // fetch the user again
  if (user) {
    const roles = (await user.$get<Role>("roles")) as Role[]; // get roles for the current user
    return roles.map<RoleDTO>(r => {
      return {
        name: r.name,
        description: r.name
      };
    });
  }

  return [];
}

Now I have a different problem – the SELECT N+1 problem. This resolver will query the database to get the roles for every user returned by the GraphQL query (not to mention the initial query to get the users):

Executing (default): SELECT id, userName, firstName, lastName, email AS emailAddress, confirmed, suspended, deleted FROM m_user AS User WHERE User.id = 15;

Executing (default): SELECT id, userName, firstName, lastName, email AS emailAddress, confirmed, suspended, deleted FROM m_user AS User WHERE User.id = 76;

Executing (default): SELECT id, userName, firstName, lastName, email AS emailAddress, confirmed, suspended, deleted FROM m_user AS User WHERE User.id = 6;

Executing (default): SELECT id, userName, firstName, lastName, email AS emailAddress, confirmed, suspended, deleted FROM m_user AS User WHERE User.id = 9;          

...          

Executing (default): SELECT Role.id, Role.shortname AS name, UserRole.roleId AS UserRole.roleId, UserRole.userId AS UserRole.userId FROM m_role AS Role INNER JOIN m_role_assignments AS UserRole ON Role.id = UserRole.roleId AND UserRole.userId = 15;          
Executing (default): SELECT Role.id, Role.shortname AS name, UserRole.roleId AS UserRole.roleId, UserRole.userId AS UserRole.userId FROM m_role AS Role INNER JOIN m_role_assignments AS UserRole ON Role.id = UserRole.roleId AND UserRole.userId = 76;      
Executing (default): SELECT Role.id, Role.shortname AS name, UserRole.roleId AS UserRole.roleId, UserRole.userId AS UserRole.userId FROM m_role AS Role INNER JOIN m_role_assignments AS UserRole ON Role.id = UserRole.roleId AND UserRole.userId = 6;          

Executing (default): SELECT Role.id, Role.shortname AS name, UserRole.roleId AS UserRole.roleId, UserRole.userId AS UserRole.userId FROM m_role AS Role INNER JOIN m_role_assignments AS UserRole ON Role.id = UserRole.roleId AND UserRole.userId = 9;

The problem has an easy solution: use a dataloader – in my case dataloader-sequelize, since I am using Sequelize to access the database.

The dataloader hooks itself into the Sequelize methods (findByPk in this example) and hijacks them, replacing them with a smart caching mechanism. If you prime the dataloader context with objects and later want to load an object by id from the database, the dataloader will check its cache, and if the object is there, it will return it immediately, thus avoiding a database roundtrip.

import { createContext, EXPECTED_OPTIONS_KEY } from "dataloader-sequelize";

@Query(returns => UserQueryResult)
async users(
  @Args() input: UserSearchArgs,
  @Ctx() ctx: Context
): Promise<UserQueryResult> {
  const context = createContext(User.sequelize); // create dataloader context 
  const found = await this.usersProvider.getUsers(input); // get the users
  context.prime(found.records); // prime the context with found records
  ctx["dataloader-context"] = context; // remember the dataloader context in GraphQL context
  return {
    page: found.page,
    users: found.records.map(this.convert), 
    rowsPerPage: found.rowsPerPage,
    totalCount: found.totalCount
  };
}

And the roles resolver method:

@FieldResolver(returns => [RoleDTO])
async roles(@Root() root: UserDTO, @Ctx() ctx: Context): Promise<RoleDTO[]> {
  const context = ctx["dataloader-context"];
  const user: User = await User.findByPk(root.id, {
    attributes: {
      exclude: ["password", "password_confirmation", "password_digest"]
    },
    [EXPECTED_OPTIONS_KEY]: context // pass in the dataloader context
  });

  if (user) {
    const roles = (await user.$get<Role>("roles", { // nasty TypeScript workaround
      [EXPECTED_OPTIONS_KEY]: context // pass in the dataloader context
    })) as Role[];
    return roles.map<RoleDTO>(r => {
      return {
        id: r.id,
        name: r.name,
        description: r.name
      };
    });
  }
  throw new NotFoundError("User with given id was not found.");
}

Dataloader has another magic feature: it batches similar database queries until the very last moment. Instead of loading the roles in N round trips, one for each user, only a single query is executed:

Executing (default): SELECT Role.id, Role.shortname AS name, UserRole.roleId AS UserRole.roleId, UserRole.userId AS UserRole.userId FROM m_role AS Role INNER JOIN m_role_assignments AS UserRole ON Role.id = UserRole.roleId AND UserRole.userId in (15, 76, 6, 9);

In total, only two queries are executed – one to get the users and one to get roles of all users from the previous query result set.

It works now, but the road to this point was quite bumpy. I have discussed the issues I ran into with the authors of the libraries I used on GitHub (for example here). I discussed problems that only existed on my machine (and were the results of my misunderstanding of how the library works), and tried and failed miserably to fix some issues myself (here).

Always create bidirectional relations in your entities; the dataloader will like you more.
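
In sequelize-typescript terms, that means declaring the inverse association on the Role model as well – a sketch with hypothetical import paths, mirroring the @BelongsToMany declaration on User shown earlier:

import { BelongsToMany, Column, Model, Table } from "sequelize-typescript";
import { User } from "./User";         // hypothetical paths
import { UserRole } from "./UserRole";

@Table({ tableName: "m_role" })
export class Role extends Model<Role> {
  @Column({ field: "shortname" })
  name: string;

  // the inverse side of User.roles
  @BelongsToMany(() => User, () => UserRole)
  users: User[];
}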

Despite the steep learning curve, I will choose TypeScript and GraphQL next time too. I don’t want to throw out everything I just learned 🙂

TypeScript build system for Sublime Text 2

Recently I decided to create a small Node.js project. And I like strongly typed languages. I know I can’t get full strong typing in JavaScript while keeping its power, so I decided to use TypeScript for the project. I chose Sublime Text 2 as the editor; I am used to Visual Studio, but I wanted to get a grip on an editor that, as I read, “everyone loves”.

After downloading all the required bits I got to build (Ctrl+B) the project and ran into this error:

error TS5037: Cannot compile external modules unless the '--module' flag is provided.

This is easy to fix when you are building a TypeScript project from the command line:

tsc --module commonjs app.ts

This is the original typescript.sublime-build file I downloaded from the Internet:

{
  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "$file"],
  "file_regex": "(.*\\.ts?)\\s\\(([0-9]+)\\,([0-9]+)\\)\\:\\s(...*?)$",
  "selector": "app.ts"
}

and the fix is very simple (though it took me quite a while to figure it out) – just add two new items ("--module" and "commonjs") to the cmd array:

{
  "cmd": ["c:\\Users\\user\\AppData\\Roaming\\npm\\tsc.cmd", "--module", "commonjs", "$file"],
  "file_regex": "(.*\\.ts?)\\s*\\(([0-9]+)\\,([0-9]+)\\)",
  "selector": "app.ts"
}

I had to modify the file_regex too, because the original one was not matching the compiler output.
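
The reason is visible in the compiler output itself – tsc prints errors in a format roughly like the line below (the message is illustrative), with no whitespace between the file name and the opening parenthesis. The original pattern required one whitespace character (\s) there, so it could never match; relaxing it to \s* (zero or more) makes it work:

app.ts(12,5): error TS2322: Type 'string' is not assignable to type 'number'.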