Interface LocalTimeArbitrary

All Superinterfaces:
Arbitrary<LocalTime>

@API(status=EXPERIMENTAL, since="1.5.1") public interface LocalTimeArbitrary extends Arbitrary<LocalTime>
Fluent interface to configure the generation of local time values.
  • Method Details

    • between

      default LocalTimeArbitrary between(LocalTime min, LocalTime max)
      Set the allowed lower bound min (included) and upper bound max (included) of generated local time values. If you don't set the precision explicitly and the min/max values have millisecond, microsecond, or nanosecond precision, the precision of generated values is set implicitly from those bounds.
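
      A minimal usage sketch, assuming Times.times() (from net.jqwik.time.api.Times) as the entry point that creates a LocalTimeArbitrary; that entry point is not shown on this page:

        import java.time.LocalTime;
        import net.jqwik.time.api.Times;
        import net.jqwik.time.api.arbitraries.LocalTimeArbitrary;

        // Office hours only; both bounds are included
        LocalTimeArbitrary officeHours =
            Times.times().between(LocalTime.of(9, 0), LocalTime.of(17, 30));

      Both bounds here have minute precision, so the default precision (seconds) stays in effect.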
    • atTheEarliest

      LocalTimeArbitrary atTheEarliest(LocalTime min)
      Set the allowed lower bound min (included) of generated local time values. If you don't set the precision explicitly and the min/max values have millisecond, microsecond, or nanosecond precision, the precision of generated values is set implicitly from those bounds.
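
      A sketch with the same assumed Times.times() entry point:

        // Never generate a time before 06:00:00
        LocalTimeArbitrary notBeforeSix =
            Times.times().atTheEarliest(LocalTime.of(6, 0));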
    • atTheLatest

      LocalTimeArbitrary atTheLatest(LocalTime max)
      Set the allowed upper bound max (included) of generated local time values. If you don't set the precision explicitly and the min/max values have millisecond, microsecond, or nanosecond precision, the precision of generated values is set implicitly from those bounds.
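
      Sketch, same assumptions as above:

        // Never generate a time after 21:59:59
        LocalTimeArbitrary notAfterTen =
            Times.times().atTheLatest(LocalTime.of(21, 59, 59));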
    • hourBetween

      LocalTimeArbitrary hourBetween(int min, int max)
      Set the allowed lower bound min (included) and upper bound max (included) of generated hour values. Hours can be between 0 and 23.
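
      Sketch, same assumptions as above:

        // Restrict the hour component to the afternoon, 12 through 18
        LocalTimeArbitrary afternoon =
            Times.times().hourBetween(12, 18);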
    • minuteBetween

      LocalTimeArbitrary minuteBetween(int min, int max)
      Set the allowed lower bound min (included) and upper bound max (included) of generated minute values. Minutes can be between 0 and 59.
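
      Sketch, same assumptions as above:

        // Restrict the minute component to the first quarter of each hour
        LocalTimeArbitrary firstQuarter =
            Times.times().minuteBetween(0, 14);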
    • secondBetween

      LocalTimeArbitrary secondBetween(int min, int max)
      Set the allowed lower bound min (included) and upper bound max (included) of generated second values. Seconds can be between 0 and 59.
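
      Sketch, same assumptions as above:

        // Restrict the second component to the first half of each minute
        LocalTimeArbitrary firstHalfMinute =
            Times.times().secondBetween(0, 29);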
    • ofPrecision

      LocalTimeArbitrary ofPrecision(ChronoUnit ofPrecision)
      Constrain the precision of generated values. Default value: Seconds. If you don't set the precision explicitly and the min/max values have millisecond, microsecond, or nanosecond precision, the precision of generated values is set implicitly from those bounds.
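
      Sketch, same assumptions as above; ChronoUnit comes from java.time.temporal:

        import java.time.temporal.ChronoUnit;

        // Generate times with millisecond precision instead of the default (seconds)
        LocalTimeArbitrary withMillis =
            Times.times().ofPrecision(ChronoUnit.MILLIS);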