We are excited to announce the general availability of Constraints, a feature that gives customers more fine-grained control over an experiment's parameter space. A subset of an experiment's continuous parameters can now be subject to a linear constraint that restricts the region SigOpt searches. This is useful when known interdependencies between parameters mean that only part of the parameter space is valid.
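To make the idea concrete, here is a minimal, hypothetical Python sketch (not the SigOpt API itself) of how a linear constraint restricts a two-dimensional parameter space; the parameter names, weights, and threshold below are illustrative assumptions:

```python
# Hypothetical sketch of a linear parameter constraint (illustrative only;
# not the SigOpt API). A point is valid when the weighted sum of its
# parameter values meets a threshold: sum_i w_i * x_i >= threshold.

def satisfies_constraint(point, weights, threshold):
    """Return True if the point lies in the constrained (valid) region."""
    return sum(w * point[name] for name, w in weights.items()) >= threshold

# Assumed example: require learning_rate + momentum <= 1.0, written in
# greater-than form as -learning_rate - momentum >= -1.0.
weights = {"learning_rate": -1.0, "momentum": -1.0}
threshold = -1.0

candidates = [
    {"learning_rate": 0.1, "momentum": 0.8},  # sum 0.9, valid
    {"learning_rate": 0.5, "momentum": 0.7},  # sum 1.2, invalid
]
valid = [p for p in candidates if satisfies_constraint(p, weights, threshold)]
```

Only points satisfying the constraint remain candidates for suggestion, which is how a linear constraint carves a valid region out of the full box-bounded space.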
Today, we are excited to announce SigOpt Organizations, the next step in the evolution of our web dashboard. Organizations is designed to help larger customers with multiple modeling teams control user access and roll up cross-team usage insights. The result is a more seamless experience for every user, whether a chief innovation officer, a head of data science, or a data scientist.
Today, we’re happy to announce a strategic investment and technology development agreement with In-Q-Tel (IQT).
IQT is a nonprofit strategic investor that accelerates the development and delivery of cutting-edge technologies to the U.S. government agencies that keep our nation safe. Established in 1999, IQT identifies and partners with startups developing innovative technologies that protect and preserve U.S. security. We’re proud to partner with IQT to bring SigOpt’s optimization-as-a-service solution to our nation’s government agencies.
The research team at SigOpt works to provide the best Bayesian optimization platform for our customers. In our spare time, we also pursue research projects in a variety of other fields. This blog post highlights one such recent project, which will be presented on Tuesday, February 6, at AAAI 2018. For those who cannot attend that session, we discuss here the topic of vector embeddings and the computational gains available when using circulant binary embeddings.
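As a rough illustration of where those gains come from, the following NumPy sketch (our own toy example, not code from the paper) compares a dense circulant projection against the equivalent FFT-based circular convolution, then takes signs to form the binary code; the dimension, seed, and sign-flip step are illustrative assumptions:

```python
import numpy as np

# Toy illustration of a circulant binary embedding (our own sketch, not
# code from the AAAI paper). A circulant projection C_r y equals the
# circular convolution of r with y, so an FFT computes it in O(d log d)
# rather than the O(d^2) of a dense matrix-vector product.

rng = np.random.default_rng(0)
d = 8
r = rng.standard_normal(d)            # first column of the circulant matrix
D = rng.choice([-1.0, 1.0], size=d)   # random sign flips, standard in CBE
x = rng.standard_normal(d)
y = D * x

# Fast path: FFT-based circulant multiply; signs give the binary code.
proj_fft = np.fft.ifft(np.fft.fft(r) * np.fft.fft(y)).real
code = np.sign(proj_fft)

# Slow path for comparison: build the dense circulant matrix explicitly.
C = np.column_stack([np.roll(r, k) for k in range(d)])  # C[i, j] = r[(i - j) % d]
proj_dense = C @ y
```

Both paths agree to numerical precision; at large dimension, the FFT path (plus the fact that only the vector r must be stored) is what makes circulant embeddings attractive.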
2017 was a great year for SigOpt. We were selected by several of the leading firms in algorithmic trading, banking, and technology to improve their optimization processes. We announced strategic partnerships with companies like AWS, Intel AI, NVIDIA, and SAP. We won industry awards such as the Barclays Innovation Challenge Award and a place on the CB Insights AI 100, and were recognized as a Gartner Cool Vendor in AI. We are looking forward to an even better 2018!
Last month, we announced the availability of SigOpt on AWS Marketplace, which allows data scientists and researchers to deploy SigOpt on AWS with the click of a button. Today, we’re doubling down on our collaboration with AWS through support for a new AWS feature called AWS PrivateLink.
AWS PrivateLink is a new offering that enables SigOpt to connect directly with any AWS customer that has an Amazon Virtual Private Cloud (VPC). Amazon VPC lets organizations provision a logically isolated section of the AWS cloud in which to launch AWS resources on a private network.
It's been a busy few months at SigOpt. We've developed and launched new features like Multimetric optimization and Linear Constraints, teamed up with AWS and NVIDIA on joint blog posts, presented our research at ICML in August, and more. Read the first edition of our quarterly newsletter to learn what the team has been working on!
There’s no question that Amazon Web Services (AWS) has played an instrumental role in modern technology. AWS has reduced the historically large equipment costs required to build and scale technology, such as servers, cables, hard drives, and power supplies, allowing entrepreneurs and software engineers to reap the benefits of the cloud. That is why we are pleased to announce today the availability of SigOpt on AWS Marketplace.
Here at SigOpt, Gaussian processes and reproducing kernel Hilbert spaces (RKHS) are important components of our Bayesian optimization methodology. Our research into this topic often exists at the intersection of approximation theory and spatial statistics. We recently attended the SIAM Computational Science and Engineering 2017 meeting in Atlanta to continue learning about outstanding accomplishments in these and other exciting topics in computation.
In addition to participating in the career fair, we also organized a minisymposium to bring together experts in Gaussian processes and RKHS and discuss recent advances relevant to Bayesian optimization. First, we would like to extend our thanks to Jian Wu, Jie Chen, and Greg Fasshauer for contributing talks to the session; their presentations provided a diverse set of insights into new strategies for improving the viability and performance of RKHS theory in Bayesian optimization. This post discusses our ongoing collaboration with Dr. Fasshauer and explores the topic of his talk at a level suitable for a broad audience.
In-depth optimization research