terraform/builtin/providers/test/resource.go

package test

import (
	"errors"
	"fmt"

	"github.com/hashicorp/terraform/helper/schema"
)
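
// testResource defines a test resource with a broad mix of attribute
// behaviors (required, optional, computed, ForceNew, bools, sets, lists,
// and maps) used to exercise the helper/schema diff and apply logic.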
func testResource() *schema.Resource {
	return &schema.Resource{
		Create: testResourceCreate,
		Read:   testResourceRead,
		Update: testResourceUpdate,
		Delete: testResourceDelete,
		Importer: &schema.ResourceImporter{
			State: schema.ImportStatePassthrough,
		},

		// Whenever "optional" changes, mark "planned_computed" as unknown
		// ("known after apply") in the plan.
		CustomizeDiff: func(d *schema.ResourceDiff, _ interface{}) error {
			if d.HasChange("optional") {
				d.SetNewComputed("planned_computed")
			}
			return nil
		},

		Schema: map[string]*schema.Schema{
			"required": {
				Type:     schema.TypeString,
				Required: true,
			},
			"optional": {
				Type:     schema.TypeString,
				Optional: true,
			},
			"optional_bool": {
				Type:     schema.TypeBool,
				Optional: true,
			},
			"optional_force_new": {
				Type:     schema.TypeString,
				Optional: true,
				ForceNew: true,
			},
			"optional_computed_map": {
				Type:     schema.TypeMap,
				Optional: true,
				Computed: true,
			},
			"optional_computed_force_new": {
				Type:     schema.TypeString,
				Optional: true,
				Computed: true,
				ForceNew: true,
			},
			"optional_computed": {
				Type:     schema.TypeString,
				Optional: true,
				Computed: true,
			},
			"computed_read_only": {
				Type:     schema.TypeString,
				Computed: true,
			},
"computed_from_required": {
Type: schema.TypeString,
Computed: true,
ForceNew: true,
},
"computed_read_only_force_new": {
Type: schema.TypeString,
Computed: true,
ForceNew: true,
},
"computed_list": {
Type: schema.TypeList,
Computed: true,
Elem: &schema.Schema{
Type: schema.TypeString,
},
},
"set": {
Type: schema.TypeSet,
Optional: true,
Elem: &schema.Schema{
Type: schema.TypeString,
},
Set: schema.HashString,
},
"computed_set": {
Type: schema.TypeSet,
Computed: true,
Elem: &schema.Schema{
Type: schema.TypeString,
},
Set: schema.HashString,
},
"map": {
Type: schema.TypeMap,
Optional: true,
},
"optional_map": {
Type: schema.TypeMap,
Optional: true,
},
"required_map": {
Type: schema.TypeMap,
Required: true,
},
"map_that_look_like_set": {
Type: schema.TypeMap,
Optional: true,
Elem: &schema.Schema{
Type: schema.TypeString,
},
},
"computed_map": {
Type: schema.TypeMap,
Computed: true,
},
"list": {
Type: schema.TypeList,
Optional: true,
Elem: &schema.Schema{
Type: schema.TypeString,
},
},
"list_of_map": {
Type: schema.TypeList,
Optional: true,
Elem: &schema.Schema{
Type: schema.TypeMap,
Elem: &schema.Schema{
Type: schema.TypeString,
},
},
},
"apply_error": {
Type: schema.TypeString,
Optional: true,
Description: "return and error during apply",
},
"planned_computed": {
Type: schema.TypeString,
Computed: true,
Description: "copied the required field during apply, and plans computed when changed",
},
			// this should return unset from GetOkExists
			"get_ok_exists_false": {
				Type:        schema.TypeBool,
				Computed:    true,
				Optional:    true,
				Description: "do not set in config",
			},
			"int": {
				Type:     schema.TypeInt,
				Optional: true,
			},
		},
	}
}
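
// testResourceCreate sets a fixed ID, returns the configured "apply_error"
// message as an error if one is present, verifies that the required
// attributes made it through to Create, copies "required" into
// "computed_from_required", and finally delegates to testResourceRead.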
func testResourceCreate(d *schema.ResourceData, meta interface{}) error {
	d.SetId("testId")

	errMsg, _ := d.Get("apply_error").(string)
	if errMsg != "" {
		return errors.New(errMsg)
	}

	// Required must make it through to Create
	if _, ok := d.GetOk("required"); !ok {
		return fmt.Errorf("Missing attribute 'required', but it's required!")
	}
	if _, ok := d.GetOk("required_map"); !ok {
		return fmt.Errorf("Missing attribute 'required_map', but it's required!")
	}
d.Set("computed_from_required", d.Get("required"))
return testResourceRead(d, meta)
}
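
// testResourceRead populates the computed attributes with fixed values and
// mirrors the "optional" attribute into "planned_computed", mimicking how a
// real provider would refresh state from its API.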
func testResourceRead(d *schema.ResourceData, meta interface{}) error {
	d.Set("computed_read_only", "value_from_api")
	d.Set("computed_read_only_force_new", "value_from_api")
	if _, ok := d.GetOk("optional_computed_map"); !ok {
		d.Set("optional_computed_map", map[string]string{})
	}
	d.Set("computed_map", map[string]string{"key1": "value1"})
	d.Set("computed_list", []string{"listval1", "listval2"})
	d.Set("computed_set", []string{"setval1", "setval2"})
	d.Set("planned_computed", d.Get("optional"))

	// if there is no "set" value, erroneously set it to an empty set. This
	// might change a null value to an empty set, but we should be able to
	// ignore that.
	s := d.Get("set")
	if s == nil || s.(*schema.Set).Len() == 0 {
		d.Set("set", []interface{}{})
	}

	// This mimics many providers always setting a *string value.
	// The existing behavior is that this will appear in the state as an empty
	// string, which we have to maintain.
	o := d.Get("optional")
	if o == "" {
		d.Set("optional", nil)
	}

	// This should not show as set unless it's set in the config
	_, ok := d.GetOkExists("get_ok_exists_false")
	if ok {
		return errors.New("get_ok_exists_false should not be set")
	}

	return nil
}
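
// testResourceUpdate fails with the configured "apply_error" message if one
// is set, then re-reads the resource so the updated values land in state.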
func testResourceUpdate(d *schema.ResourceData, meta interface{}) error {
	errMsg, _ := d.Get("apply_error").(string)
	if errMsg != "" {
		return errors.New(errMsg)
	}

	return testResourceRead(d, meta)
}
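
// testResourceDelete clears the ID to signal that the resource no longer exists.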
func testResourceDelete(d *schema.ResourceData, meta interface{}) error {
	d.SetId("")
	return nil
}